Apr 23 13:29:35.829587 ip-10-0-134-22 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory Apr 23 13:29:35.829601 ip-10-0-134-22 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory Apr 23 13:29:35.829611 ip-10-0-134-22 systemd[1]: kubelet.service: Failed with result 'resources'. Apr 23 13:29:35.829983 ip-10-0-134-22 systemd[1]: Failed to start Kubernetes Kubelet. Apr 23 13:29:45.898189 ip-10-0-134-22 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found. Apr 23 13:29:45.898208 ip-10-0-134-22 systemd[1]: kubelet.service: Failed with result 'resources'. -- Boot 2dfb0e83f02b47608e6ea9a100430319 -- Apr 23 13:31:48.409698 ip-10-0-134-22 systemd[1]: Starting Kubernetes Kubelet... Apr 23 13:31:48.920299 ip-10-0-134-22 kubenswrapper[2576]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Apr 23 13:31:48.920299 ip-10-0-134-22 kubenswrapper[2576]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Apr 23 13:31:48.920299 ip-10-0-134-22 kubenswrapper[2576]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Apr 23 13:31:48.920299 ip-10-0-134-22 kubenswrapper[2576]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Apr 23 13:31:48.920299 ip-10-0-134-22 kubenswrapper[2576]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. 
See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Apr 23 13:31:48.921978 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.921892 2576 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Apr 23 13:31:48.928860 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.928836 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 23 13:31:48.928860 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.928855 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 23 13:31:48.928860 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.928859 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 23 13:31:48.928860 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.928862 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 23 13:31:48.928860 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.928866 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 23 13:31:48.928860 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.928869 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 23 13:31:48.929079 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.928872 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 23 13:31:48.929079 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.928874 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 23 13:31:48.929079 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.928877 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 23 13:31:48.929079 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.928880 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 23 13:31:48.929079 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.928883 2576 feature_gate.go:328] 
unrecognized feature gate: Example2 Apr 23 13:31:48.929079 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.928885 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 23 13:31:48.929079 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.928888 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 23 13:31:48.929079 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.928891 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 23 13:31:48.929079 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.928893 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 23 13:31:48.929079 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.928896 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 23 13:31:48.929079 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.928899 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 23 13:31:48.929079 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.928901 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 23 13:31:48.929079 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.928903 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 23 13:31:48.929079 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.928906 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 23 13:31:48.929079 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.928909 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 23 13:31:48.929079 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.928911 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 23 13:31:48.929079 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.928914 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 23 13:31:48.929079 ip-10-0-134-22 
kubenswrapper[2576]: W0423 13:31:48.928918 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 23 13:31:48.929079 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.928922 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 23 13:31:48.929564 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.928925 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 23 13:31:48.929564 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.928928 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 23 13:31:48.929564 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.928934 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 23 13:31:48.929564 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.928937 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 23 13:31:48.929564 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.928940 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 23 13:31:48.929564 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.928942 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 23 13:31:48.929564 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.928945 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 23 13:31:48.929564 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.928948 2576 feature_gate.go:328] unrecognized feature gate: Example Apr 23 13:31:48.929564 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.928951 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 23 13:31:48.929564 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.928954 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 23 13:31:48.929564 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.928956 2576 feature_gate.go:328] unrecognized feature gate: 
NewOLMPreflightPermissionChecks Apr 23 13:31:48.929564 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.928959 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 23 13:31:48.929564 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.928961 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 23 13:31:48.929564 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.928964 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 23 13:31:48.929564 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.928968 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 23 13:31:48.929564 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.928971 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 23 13:31:48.929564 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.928973 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 23 13:31:48.929564 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.928976 2576 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 23 13:31:48.929564 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.928979 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 23 13:31:48.929564 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.928982 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 23 13:31:48.930040 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.928984 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 23 13:31:48.930040 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.928987 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 23 13:31:48.930040 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.928990 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 23 13:31:48.930040 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.928992 2576 feature_gate.go:328] 
unrecognized feature gate: MultiDiskSetup Apr 23 13:31:48.930040 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.928996 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 23 13:31:48.930040 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.928999 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 23 13:31:48.930040 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.929001 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 23 13:31:48.930040 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.929004 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 23 13:31:48.930040 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.929007 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 23 13:31:48.930040 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.929010 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 23 13:31:48.930040 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.929012 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 23 13:31:48.930040 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.929015 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 23 13:31:48.930040 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.929017 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 23 13:31:48.930040 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.929020 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 23 13:31:48.930040 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.929022 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 23 13:31:48.930040 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.929025 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 23 13:31:48.930040 ip-10-0-134-22 
kubenswrapper[2576]: W0423 13:31:48.929028 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 23 13:31:48.930040 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.929030 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 23 13:31:48.930040 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.929033 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 23 13:31:48.930532 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.929035 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 23 13:31:48.930532 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.929039 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 23 13:31:48.930532 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.929041 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 23 13:31:48.930532 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.929044 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 23 13:31:48.930532 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.929046 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 23 13:31:48.930532 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.929050 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 23 13:31:48.930532 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.929054 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 23 13:31:48.930532 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.929058 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 23 13:31:48.930532 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.929061 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 23 13:31:48.930532 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.929063 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 23 13:31:48.930532 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.929066 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 23 13:31:48.930532 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.929069 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 23 13:31:48.930532 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.929071 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 23 13:31:48.930532 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.929074 2576 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 23 13:31:48.930532 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.929086 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 23 13:31:48.930532 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.929089 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 23 13:31:48.930532 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.929092 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 23 13:31:48.930532 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.929095 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 23 13:31:48.930532 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.929097 2576 
feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 23 13:31:48.930993 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.929100 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 23 13:31:48.930993 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.929102 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 23 13:31:48.930993 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.929105 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 23 13:31:48.930993 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.929537 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 23 13:31:48.930993 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.929542 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 23 13:31:48.930993 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.929545 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 23 13:31:48.930993 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.929548 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 23 13:31:48.930993 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.929550 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 23 13:31:48.930993 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.929553 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 23 13:31:48.930993 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.929556 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 23 13:31:48.930993 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.929562 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 23 13:31:48.930993 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.929565 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 23 13:31:48.930993 ip-10-0-134-22 kubenswrapper[2576]: W0423 
13:31:48.929567 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 23 13:31:48.930993 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.929570 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 23 13:31:48.930993 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.929572 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 23 13:31:48.930993 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.929575 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 23 13:31:48.930993 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.929578 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 23 13:31:48.930993 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.929581 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 23 13:31:48.930993 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.929583 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 23 13:31:48.930993 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.929586 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 23 13:31:48.931531 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.929590 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 23 13:31:48.931531 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.929594 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 23 13:31:48.931531 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.929596 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 23 13:31:48.931531 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.929599 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 23 13:31:48.931531 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.929602 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 23 13:31:48.931531 ip-10-0-134-22 
kubenswrapper[2576]: W0423 13:31:48.929605 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 23 13:31:48.931531 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.929608 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 23 13:31:48.931531 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.929616 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 23 13:31:48.931531 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.929618 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 23 13:31:48.931531 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.929621 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 23 13:31:48.931531 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.929623 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 23 13:31:48.931531 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.929626 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 23 13:31:48.931531 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.929628 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 23 13:31:48.931531 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.929631 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 23 13:31:48.931531 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.929633 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 23 13:31:48.931531 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.929637 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 23 13:31:48.931531 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.929641 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 23 13:31:48.931531 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.929644 2576 feature_gate.go:328] unrecognized feature gate: Example2 Apr 23 13:31:48.931531 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.929646 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 23 13:31:48.932011 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.929649 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 23 13:31:48.932011 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.929651 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 23 13:31:48.932011 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.929653 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 23 13:31:48.932011 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.929656 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 23 13:31:48.932011 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.929659 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 23 13:31:48.932011 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.929661 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 23 13:31:48.932011 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.929663 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 23 13:31:48.932011 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.929666 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 23 13:31:48.932011 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.929668 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 23 13:31:48.932011 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.929671 2576 
feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 23 13:31:48.932011 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.929674 2576 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 23 13:31:48.932011 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.929677 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 23 13:31:48.932011 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.929679 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 23 13:31:48.932011 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.929682 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 23 13:31:48.932011 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.929685 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 23 13:31:48.932011 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.929688 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 23 13:31:48.932011 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.929690 2576 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 23 13:31:48.932011 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.929693 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 23 13:31:48.932011 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.929695 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 23 13:31:48.932011 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.929698 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 23 13:31:48.932510 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.929701 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 23 13:31:48.932510 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.929710 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 23 13:31:48.932510 ip-10-0-134-22 kubenswrapper[2576]: 
W0423 13:31:48.929713 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 23 13:31:48.932510 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.929717 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 23 13:31:48.932510 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.929720 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 23 13:31:48.932510 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.929723 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 23 13:31:48.932510 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.929725 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 23 13:31:48.932510 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.929728 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 23 13:31:48.932510 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.929730 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 23 13:31:48.932510 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.929733 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 23 13:31:48.932510 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.929736 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 23 13:31:48.932510 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.929738 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 23 13:31:48.932510 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.929741 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 23 13:31:48.932510 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.929743 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 23 13:31:48.932510 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.929746 2576 feature_gate.go:328] unrecognized feature gate: 
DNSNameResolver Apr 23 13:31:48.932510 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.929748 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 23 13:31:48.932510 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.929751 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 23 13:31:48.932510 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.929753 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 23 13:31:48.932510 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.929755 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 23 13:31:48.932510 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.929758 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 23 13:31:48.932998 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.929761 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 23 13:31:48.932998 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.929764 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 23 13:31:48.932998 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.929766 2576 feature_gate.go:328] unrecognized feature gate: Example Apr 23 13:31:48.932998 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.929769 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 23 13:31:48.932998 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.929772 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 23 13:31:48.932998 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.929775 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 23 13:31:48.932998 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.929777 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 23 13:31:48.932998 ip-10-0-134-22 kubenswrapper[2576]: 
W0423 13:31:48.929780 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 23 13:31:48.932998 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.929782 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 23 13:31:48.932998 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.929785 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 23 13:31:48.932998 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.929859 2576 flags.go:64] FLAG: --address="0.0.0.0" Apr 23 13:31:48.932998 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.929866 2576 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Apr 23 13:31:48.932998 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.929877 2576 flags.go:64] FLAG: --anonymous-auth="true" Apr 23 13:31:48.932998 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.929882 2576 flags.go:64] FLAG: --application-metrics-count-limit="100" Apr 23 13:31:48.932998 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.929893 2576 flags.go:64] FLAG: --authentication-token-webhook="false" Apr 23 13:31:48.932998 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.929896 2576 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Apr 23 13:31:48.932998 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.929901 2576 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Apr 23 13:31:48.932998 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.929906 2576 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Apr 23 13:31:48.932998 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.929909 2576 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Apr 23 13:31:48.932998 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.929912 2576 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Apr 23 13:31:48.932998 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.929915 2576 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" 
Apr 23 13:31:48.933530 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.929919 2576 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 23 13:31:48.933530 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.929922 2576 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 23 13:31:48.933530 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.929924 2576 flags.go:64] FLAG: --cgroup-root=""
Apr 23 13:31:48.933530 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.929927 2576 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 23 13:31:48.933530 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.929930 2576 flags.go:64] FLAG: --client-ca-file=""
Apr 23 13:31:48.933530 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.929934 2576 flags.go:64] FLAG: --cloud-config=""
Apr 23 13:31:48.933530 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.929936 2576 flags.go:64] FLAG: --cloud-provider="external"
Apr 23 13:31:48.933530 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.929939 2576 flags.go:64] FLAG: --cluster-dns="[]"
Apr 23 13:31:48.933530 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.929946 2576 flags.go:64] FLAG: --cluster-domain=""
Apr 23 13:31:48.933530 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.929949 2576 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 23 13:31:48.933530 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.929952 2576 flags.go:64] FLAG: --config-dir=""
Apr 23 13:31:48.933530 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.929954 2576 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 23 13:31:48.933530 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.929958 2576 flags.go:64] FLAG: --container-log-max-files="5"
Apr 23 13:31:48.933530 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.929961 2576 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 23 13:31:48.933530 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.929964 2576 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 23 13:31:48.933530 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.929968 2576 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 23 13:31:48.933530 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.929972 2576 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 23 13:31:48.933530 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.929975 2576 flags.go:64] FLAG: --contention-profiling="false"
Apr 23 13:31:48.933530 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.929978 2576 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 23 13:31:48.933530 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.929981 2576 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 23 13:31:48.933530 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.929985 2576 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 23 13:31:48.933530 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.929988 2576 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 23 13:31:48.933530 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.929992 2576 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 23 13:31:48.933530 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.929995 2576 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 23 13:31:48.933530 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.929998 2576 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 23 13:31:48.934134 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.930000 2576 flags.go:64] FLAG: --enable-load-reader="false"
Apr 23 13:31:48.934134 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.930009 2576 flags.go:64] FLAG: --enable-server="true"
Apr 23 13:31:48.934134 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.930012 2576 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 23 13:31:48.934134 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.930018 2576 flags.go:64] FLAG: --event-burst="100"
Apr 23 13:31:48.934134 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.930022 2576 flags.go:64] FLAG: --event-qps="50"
Apr 23 13:31:48.934134 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.930030 2576 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 23 13:31:48.934134 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.930033 2576 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 23 13:31:48.934134 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.930036 2576 flags.go:64] FLAG: --eviction-hard=""
Apr 23 13:31:48.934134 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.930040 2576 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 23 13:31:48.934134 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.930043 2576 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 23 13:31:48.934134 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.930046 2576 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 23 13:31:48.934134 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.930049 2576 flags.go:64] FLAG: --eviction-soft=""
Apr 23 13:31:48.934134 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.930052 2576 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 23 13:31:48.934134 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.930055 2576 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 23 13:31:48.934134 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.930058 2576 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 23 13:31:48.934134 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.930061 2576 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 23 13:31:48.934134 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.930064 2576 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 23 13:31:48.934134 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.930067 2576 flags.go:64] FLAG: --fail-swap-on="true"
Apr 23 13:31:48.934134 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.930070 2576 flags.go:64] FLAG: --feature-gates=""
Apr 23 13:31:48.934134 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.930073 2576 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 23 13:31:48.934134 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.930076 2576 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 23 13:31:48.934134 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.930079 2576 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 23 13:31:48.934134 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.930090 2576 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 23 13:31:48.934134 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.930093 2576 flags.go:64] FLAG: --healthz-port="10248"
Apr 23 13:31:48.934134 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.930096 2576 flags.go:64] FLAG: --help="false"
Apr 23 13:31:48.934761 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.930099 2576 flags.go:64] FLAG: --hostname-override="ip-10-0-134-22.ec2.internal"
Apr 23 13:31:48.934761 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.930103 2576 flags.go:64] FLAG: --housekeeping-interval="10s"
Apr 23 13:31:48.934761 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.930106 2576 flags.go:64] FLAG: --http-check-frequency="20s"
Apr 23 13:31:48.934761 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.930108 2576 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins"
Apr 23 13:31:48.934761 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.930112 2576 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml"
Apr 23 13:31:48.934761 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.930115 2576 flags.go:64] FLAG: --image-gc-high-threshold="85"
Apr 23 13:31:48.934761 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.930118 2576 flags.go:64] FLAG: --image-gc-low-threshold="80"
Apr 23 13:31:48.934761 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.930121 2576 flags.go:64] FLAG: --image-service-endpoint=""
Apr 23 13:31:48.934761 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.930123 2576 flags.go:64] FLAG: --kernel-memcg-notification="false"
Apr 23 13:31:48.934761 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.930132 2576 flags.go:64] FLAG: --kube-api-burst="100"
Apr 23 13:31:48.934761 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.930135 2576 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Apr 23 13:31:48.934761 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.930138 2576 flags.go:64] FLAG: --kube-api-qps="50"
Apr 23 13:31:48.934761 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.930141 2576 flags.go:64] FLAG: --kube-reserved=""
Apr 23 13:31:48.934761 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.930144 2576 flags.go:64] FLAG: --kube-reserved-cgroup=""
Apr 23 13:31:48.934761 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.930147 2576 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Apr 23 13:31:48.934761 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.930150 2576 flags.go:64] FLAG: --kubelet-cgroups=""
Apr 23 13:31:48.934761 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.930152 2576 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Apr 23 13:31:48.934761 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.930155 2576 flags.go:64] FLAG: --lock-file=""
Apr 23 13:31:48.934761 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.930158 2576 flags.go:64] FLAG: --log-cadvisor-usage="false"
Apr 23 13:31:48.934761 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.930161 2576 flags.go:64] FLAG: --log-flush-frequency="5s"
Apr 23 13:31:48.934761 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.930164 2576 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Apr 23 13:31:48.934761 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.930182 2576 flags.go:64] FLAG: --log-json-split-stream="false"
Apr 23 13:31:48.934761 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.930186 2576 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Apr 23 13:31:48.935325 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.930190 2576 flags.go:64] FLAG: --log-text-split-stream="false"
Apr 23 13:31:48.935325 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.930193 2576 flags.go:64] FLAG: --logging-format="text"
Apr 23 13:31:48.935325 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.930196 2576 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Apr 23 13:31:48.935325 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.930199 2576 flags.go:64] FLAG: --make-iptables-util-chains="true"
Apr 23 13:31:48.935325 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.930202 2576 flags.go:64] FLAG: --manifest-url=""
Apr 23 13:31:48.935325 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.930205 2576 flags.go:64] FLAG: --manifest-url-header=""
Apr 23 13:31:48.935325 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.930219 2576 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Apr 23 13:31:48.935325 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.930222 2576 flags.go:64] FLAG: --max-open-files="1000000"
Apr 23 13:31:48.935325 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.930226 2576 flags.go:64] FLAG: --max-pods="110"
Apr 23 13:31:48.935325 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.930229 2576 flags.go:64] FLAG: --maximum-dead-containers="-1"
Apr 23 13:31:48.935325 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.930232 2576 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Apr 23 13:31:48.935325 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.930235 2576 flags.go:64] FLAG: --memory-manager-policy="None"
Apr 23 13:31:48.935325 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.930238 2576 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Apr 23 13:31:48.935325 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.930241 2576 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Apr 23 13:31:48.935325 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.930244 2576 flags.go:64] FLAG: --node-ip="0.0.0.0"
Apr 23 13:31:48.935325 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.930247 2576 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel"
Apr 23 13:31:48.935325 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.930254 2576 flags.go:64] FLAG: --node-status-max-images="50"
Apr 23 13:31:48.935325 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.930257 2576 flags.go:64] FLAG: --node-status-update-frequency="10s"
Apr 23 13:31:48.935325 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.930260 2576 flags.go:64] FLAG: --oom-score-adj="-999"
Apr 23 13:31:48.935325 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.930270 2576 flags.go:64] FLAG: --pod-cidr=""
Apr 23 13:31:48.935325 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.930273 2576 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715"
Apr 23 13:31:48.935325 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.930278 2576 flags.go:64] FLAG: --pod-manifest-path=""
Apr 23 13:31:48.935325 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.930281 2576 flags.go:64] FLAG: --pod-max-pids="-1"
Apr 23 13:31:48.935325 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.930284 2576 flags.go:64] FLAG: --pods-per-core="0"
Apr 23 13:31:48.935922 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.930287 2576 flags.go:64] FLAG: --port="10250"
Apr 23 13:31:48.935922 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.930290 2576 flags.go:64] FLAG: --protect-kernel-defaults="false"
Apr 23 13:31:48.935922 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.930293 2576 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-03fe9f5e738deecd7"
Apr 23 13:31:48.935922 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.930296 2576 flags.go:64] FLAG: --qos-reserved=""
Apr 23 13:31:48.935922 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.930299 2576 flags.go:64] FLAG: --read-only-port="10255"
Apr 23 13:31:48.935922 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.930302 2576 flags.go:64] FLAG: --register-node="true"
Apr 23 13:31:48.935922 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.930305 2576 flags.go:64] FLAG: --register-schedulable="true"
Apr 23 13:31:48.935922 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.930307 2576 flags.go:64] FLAG: --register-with-taints=""
Apr 23 13:31:48.935922 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.930311 2576 flags.go:64] FLAG: --registry-burst="10"
Apr 23 13:31:48.935922 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.930314 2576 flags.go:64] FLAG: --registry-qps="5"
Apr 23 13:31:48.935922 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.930317 2576 flags.go:64] FLAG: --reserved-cpus=""
Apr 23 13:31:48.935922 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.930320 2576 flags.go:64] FLAG: --reserved-memory=""
Apr 23 13:31:48.935922 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.930324 2576 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Apr 23 13:31:48.935922 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.930327 2576 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Apr 23 13:31:48.935922 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.930330 2576 flags.go:64] FLAG: --rotate-certificates="false"
Apr 23 13:31:48.935922 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.930334 2576 flags.go:64] FLAG: --rotate-server-certificates="false"
Apr 23 13:31:48.935922 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.930338 2576 flags.go:64] FLAG: --runonce="false"
Apr 23 13:31:48.935922 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.930341 2576 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Apr 23 13:31:48.935922 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.930344 2576 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Apr 23 13:31:48.935922 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.930347 2576 flags.go:64] FLAG: --seccomp-default="false"
Apr 23 13:31:48.935922 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.930349 2576 flags.go:64] FLAG: --serialize-image-pulls="true"
Apr 23 13:31:48.935922 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.930352 2576 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Apr 23 13:31:48.935922 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.930355 2576 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Apr 23 13:31:48.935922 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.930358 2576 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Apr 23 13:31:48.935922 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.930361 2576 flags.go:64] FLAG: --storage-driver-password="root"
Apr 23 13:31:48.935922 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.930364 2576 flags.go:64] FLAG: --storage-driver-secure="false"
Apr 23 13:31:48.936599 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.930367 2576 flags.go:64] FLAG: --storage-driver-table="stats"
Apr 23 13:31:48.936599 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.930370 2576 flags.go:64] FLAG: --storage-driver-user="root"
Apr 23 13:31:48.936599 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.930378 2576 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Apr 23 13:31:48.936599 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.930381 2576 flags.go:64] FLAG: --sync-frequency="1m0s"
Apr 23 13:31:48.936599 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.930384 2576 flags.go:64] FLAG: --system-cgroups=""
Apr 23 13:31:48.936599 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.930387 2576 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
Apr 23 13:31:48.936599 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.930393 2576 flags.go:64] FLAG: --system-reserved-cgroup=""
Apr 23 13:31:48.936599 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.930396 2576 flags.go:64] FLAG: --tls-cert-file=""
Apr 23 13:31:48.936599 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.930398 2576 flags.go:64] FLAG: --tls-cipher-suites="[]"
Apr 23 13:31:48.936599 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.930405 2576 flags.go:64] FLAG: --tls-min-version=""
Apr 23 13:31:48.936599 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.930408 2576 flags.go:64] FLAG: --tls-private-key-file=""
Apr 23 13:31:48.936599 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.930423 2576 flags.go:64] FLAG: --topology-manager-policy="none"
Apr 23 13:31:48.936599 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.930426 2576 flags.go:64] FLAG: --topology-manager-policy-options=""
Apr 23 13:31:48.936599 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.930429 2576 flags.go:64] FLAG: --topology-manager-scope="container"
Apr 23 13:31:48.936599 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.930432 2576 flags.go:64] FLAG: --v="2"
Apr 23 13:31:48.936599 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.930436 2576 flags.go:64] FLAG: --version="false"
Apr 23 13:31:48.936599 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.930440 2576 flags.go:64] FLAG: --vmodule=""
Apr 23 13:31:48.936599 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.930444 2576 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Apr 23 13:31:48.936599 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.930447 2576 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Apr 23 13:31:48.936599 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.930540 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 13:31:48.936599 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.930545 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 13:31:48.936599 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.930549 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 13:31:48.936599 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.930552 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 13:31:48.936599 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.930555 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 13:31:48.937186 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.930558 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 13:31:48.937186 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.930560 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 13:31:48.937186 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.930563 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 13:31:48.937186 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.930565 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 13:31:48.937186 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.930568 2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 13:31:48.937186 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.930570 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 13:31:48.937186 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.930573 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 13:31:48.937186 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.930575 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 13:31:48.937186 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.930578 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 13:31:48.937186 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.930580 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 13:31:48.937186 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.930583 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 13:31:48.937186 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.930586 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 13:31:48.937186 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.930589 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 13:31:48.937186 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.930591 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 13:31:48.937186 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.930594 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 13:31:48.937186 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.930597 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 13:31:48.937186 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.930599 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 13:31:48.937186 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.930602 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 13:31:48.937186 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.930604 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 13:31:48.937186 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.930607 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 13:31:48.937809 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.930609 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 13:31:48.937809 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.930611 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 13:31:48.937809 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.930614 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 13:31:48.937809 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.930616 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 13:31:48.937809 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.930619 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 13:31:48.937809 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.930622 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 13:31:48.937809 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.930624 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 13:31:48.937809 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.930627 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 13:31:48.937809 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.930631 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 13:31:48.937809 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.930634 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 13:31:48.937809 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.930637 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 13:31:48.937809 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.930639 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 13:31:48.937809 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.930642 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 13:31:48.937809 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.930644 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 13:31:48.937809 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.930647 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 13:31:48.937809 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.930649 2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 13:31:48.937809 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.930652 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 13:31:48.937809 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.930655 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 13:31:48.937809 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.930658 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 13:31:48.937809 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.930660 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 13:31:48.938310 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.930663 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 13:31:48.938310 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.930666 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 13:31:48.938310 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.930668 2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 13:31:48.938310 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.930671 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 13:31:48.938310 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.930674 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 13:31:48.938310 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.930678 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 13:31:48.938310 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.930682 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 13:31:48.938310 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.930684 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 13:31:48.938310 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.930687 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 13:31:48.938310 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.930689 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 13:31:48.938310 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.930692 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 13:31:48.938310 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.930694 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 13:31:48.938310 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.930697 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 13:31:48.938310 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.930700 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 13:31:48.938310 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.930702 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 13:31:48.938310 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.930705 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 13:31:48.938310 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.930707 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 13:31:48.938310 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.930710 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 13:31:48.938756 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.930714 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 13:31:48.938756 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.930717 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 13:31:48.938756 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.930721 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 13:31:48.938756 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.930725 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 13:31:48.938756 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.930728 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 13:31:48.938756 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.930731 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 13:31:48.938756 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.930733 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 13:31:48.938756 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.930736 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 13:31:48.938756 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.930738 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 13:31:48.938756 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.930741 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 13:31:48.938756 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.930743 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 13:31:48.938756 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.930746 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 13:31:48.938756 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.930748 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 13:31:48.938756 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.930751 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 13:31:48.938756 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.930753 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 13:31:48.938756 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.930756 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 13:31:48.938756 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.930758 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 13:31:48.938756 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.930761 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 13:31:48.938756 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.930764 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 13:31:48.939230 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.930767 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 13:31:48.939230 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.930769 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 13:31:48.939230 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.930772 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 13:31:48.939230 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.930774 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 13:31:48.939230 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.930782 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 23 13:31:48.939230 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.938510 2576 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 23 13:31:48.939230 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.938528 2576 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 23 13:31:48.939230 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.938575 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 13:31:48.939230 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.938580 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 13:31:48.939230 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.938584 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 13:31:48.939230 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.938588 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 13:31:48.939230 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.938591 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 13:31:48.939230 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.938594 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 13:31:48.939230 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.938596 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 13:31:48.939230 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.938599 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 13:31:48.939230 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.938602 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 13:31:48.939703 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.938604 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 13:31:48.939703 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.938621 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 13:31:48.939703 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.938625 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 13:31:48.939703 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.938628 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 13:31:48.939703 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.938631 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 13:31:48.939703 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.938634 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 13:31:48.939703 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.938637 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 13:31:48.939703 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.938639 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 13:31:48.939703 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.938642 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 13:31:48.939703 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.938645 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 13:31:48.939703 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.938648 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 13:31:48.939703 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.938650 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 13:31:48.939703 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.938653 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 13:31:48.939703 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.938656 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 13:31:48.939703 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.938658 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 13:31:48.939703 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.938661 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 13:31:48.939703 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.938663 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 13:31:48.939703 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.938665 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 13:31:48.939703 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.938668 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 13:31:48.939703 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.938671 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 13:31:48.940228 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.938673 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 13:31:48.940228 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.938676 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 13:31:48.940228 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.938680 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 13:31:48.940228 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.938683 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 13:31:48.940228 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.938686 2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 13:31:48.940228 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.938688 2576 feature_gate.go:328] unrecognized feature gate:
AzureWorkloadIdentity Apr 23 13:31:48.940228 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.938691 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 23 13:31:48.940228 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.938693 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 23 13:31:48.940228 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.938696 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 23 13:31:48.940228 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.938698 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 23 13:31:48.940228 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.938701 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 23 13:31:48.940228 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.938703 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 23 13:31:48.940228 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.938706 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 23 13:31:48.940228 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.938709 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 23 13:31:48.940228 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.938711 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 23 13:31:48.940228 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.938714 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 23 13:31:48.940228 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.938716 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 23 13:31:48.940228 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.938719 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 23 13:31:48.940228 ip-10-0-134-22 
kubenswrapper[2576]: W0423 13:31:48.938722 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 23 13:31:48.940228 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.938724 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 23 13:31:48.940744 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.938727 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 23 13:31:48.940744 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.938731 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 23 13:31:48.940744 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.938735 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 23 13:31:48.940744 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.938738 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 23 13:31:48.940744 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.938741 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 23 13:31:48.940744 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.938744 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 23 13:31:48.940744 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.938747 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 23 13:31:48.940744 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.938749 2576 feature_gate.go:328] unrecognized feature gate: Example2 Apr 23 13:31:48.940744 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.938752 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 23 13:31:48.940744 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.938754 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 23 13:31:48.940744 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.938757 2576 feature_gate.go:328] unrecognized 
feature gate: AlibabaPlatform Apr 23 13:31:48.940744 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.938760 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 23 13:31:48.940744 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.938762 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 23 13:31:48.940744 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.938765 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 23 13:31:48.940744 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.938769 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 23 13:31:48.940744 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.938773 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 23 13:31:48.940744 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.938778 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 23 13:31:48.940744 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.938781 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 23 13:31:48.940744 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.938783 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 23 13:31:48.941238 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.938786 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 23 13:31:48.941238 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.938789 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 23 13:31:48.941238 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.938791 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 23 13:31:48.941238 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.938794 2576 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 23 13:31:48.941238 ip-10-0-134-22 
kubenswrapper[2576]: W0423 13:31:48.938797 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 23 13:31:48.941238 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.938799 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 23 13:31:48.941238 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.938802 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 23 13:31:48.941238 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.938805 2576 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 23 13:31:48.941238 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.938807 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 23 13:31:48.941238 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.938810 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 23 13:31:48.941238 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.938813 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 23 13:31:48.941238 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.938816 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 23 13:31:48.941238 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.938818 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 23 13:31:48.941238 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.938821 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 23 13:31:48.941238 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.938824 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 23 13:31:48.941238 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.938826 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 23 13:31:48.941238 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.938829 2576 feature_gate.go:328] unrecognized feature gate: 
AzureClusterHostedDNSInstall Apr 23 13:31:48.941238 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.938831 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 23 13:31:48.941693 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.938836 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 23 13:31:48.941693 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.938932 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 23 13:31:48.941693 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.938937 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 23 13:31:48.941693 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.938940 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 23 13:31:48.941693 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.938943 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 23 13:31:48.941693 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.938946 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 23 13:31:48.941693 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.938949 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 23 13:31:48.941693 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.938951 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 23 13:31:48.941693 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.938954 2576 
feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 23 13:31:48.941693 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.938957 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 23 13:31:48.941693 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.938960 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 23 13:31:48.941693 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.938963 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 23 13:31:48.941693 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.938966 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 23 13:31:48.941693 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.938969 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 23 13:31:48.941693 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.938972 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 23 13:31:48.942066 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.938974 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 23 13:31:48.942066 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.938977 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 23 13:31:48.942066 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.938979 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 23 13:31:48.942066 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.938982 2576 feature_gate.go:328] unrecognized feature gate: Example2 Apr 23 13:31:48.942066 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.938984 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 23 13:31:48.942066 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.938987 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 23 13:31:48.942066 ip-10-0-134-22 kubenswrapper[2576]: 
W0423 13:31:48.938989 2576 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 23 13:31:48.942066 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.938992 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 23 13:31:48.942066 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.938995 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 23 13:31:48.942066 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.938997 2576 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 23 13:31:48.942066 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.939000 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 23 13:31:48.942066 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.939002 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 23 13:31:48.942066 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.939005 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 23 13:31:48.942066 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.939007 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 23 13:31:48.942066 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.939010 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 23 13:31:48.942066 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.939012 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 23 13:31:48.942066 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.939015 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 23 13:31:48.942066 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.939017 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 23 13:31:48.942066 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.939020 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 23 13:31:48.942066 ip-10-0-134-22 
kubenswrapper[2576]: W0423 13:31:48.939022 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 23 13:31:48.942066 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.939025 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 23 13:31:48.942625 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.939027 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 23 13:31:48.942625 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.939029 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 23 13:31:48.942625 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.939032 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 23 13:31:48.942625 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.939035 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 23 13:31:48.942625 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.939037 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 23 13:31:48.942625 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.939040 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 23 13:31:48.942625 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.939043 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 23 13:31:48.942625 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.939046 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 23 13:31:48.942625 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.939049 2576 feature_gate.go:328] unrecognized feature gate: Example Apr 23 13:31:48.942625 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.939052 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 23 13:31:48.942625 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.939054 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud 
Apr 23 13:31:48.942625 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.939057 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 23 13:31:48.942625 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.939059 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 23 13:31:48.942625 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.939062 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 23 13:31:48.942625 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.939065 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 23 13:31:48.942625 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.939067 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 23 13:31:48.942625 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.939070 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 23 13:31:48.942625 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.939072 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 23 13:31:48.942625 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.939075 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 23 13:31:48.942625 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.939077 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 23 13:31:48.943112 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.939080 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 23 13:31:48.943112 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.939082 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 23 13:31:48.943112 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.939084 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 23 13:31:48.943112 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.939087 2576 
feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 23 13:31:48.943112 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.939091 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 23 13:31:48.943112 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.939094 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 23 13:31:48.943112 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.939097 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 23 13:31:48.943112 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.939099 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 23 13:31:48.943112 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.939102 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 23 13:31:48.943112 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.939104 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 23 13:31:48.943112 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.939107 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 23 13:31:48.943112 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.939109 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 23 13:31:48.943112 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.939113 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 23 13:31:48.943112 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.939117 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 23 13:31:48.943112 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.939119 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 23 13:31:48.943112 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.939122 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 23 13:31:48.943112 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.939125 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 23 13:31:48.943112 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.939128 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 23 13:31:48.943112 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.939131 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 23 13:31:48.943703 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.939134 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 23 13:31:48.943703 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.939136 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 23 13:31:48.943703 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.939139 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 23 13:31:48.943703 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.939142 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 23 13:31:48.943703 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.939145 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 23 13:31:48.943703 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.939147 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 23 13:31:48.943703 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.939150 2576 feature_gate.go:328] unrecognized feature 
gate: NoRegistryClusterOperations Apr 23 13:31:48.943703 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.939152 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 23 13:31:48.943703 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.939155 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 23 13:31:48.943703 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.939157 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 23 13:31:48.943703 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.939159 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 23 13:31:48.943703 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:48.939162 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 23 13:31:48.943703 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.939167 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 23 13:31:48.943703 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.940545 2576 server.go:962] "Client rotation is on, will bootstrap in background" Apr 23 13:31:48.943703 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.943592 2576 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 23 13:31:48.945742 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.945730 2576 server.go:1019] "Starting client certificate rotation" Apr 23 13:31:48.945840 
ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.945824 2576 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 23 13:31:48.945878 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.945857 2576 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 23 13:31:48.974400 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.974382 2576 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 23 13:31:48.980489 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.980457 2576 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 23 13:31:48.995940 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:48.995919 2576 log.go:25] "Validated CRI v1 runtime API" Apr 23 13:31:49.003451 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:49.003426 2576 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 23 13:31:49.003451 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:49.003448 2576 log.go:25] "Validated CRI v1 image API" Apr 23 13:31:49.005302 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:49.005287 2576 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 23 13:31:49.012984 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:49.012956 2576 fs.go:135] Filesystem UUIDs: map[18b9472d-2f50-44dc-aa6c-3aa2b8ba21fe:/dev/nvme0n1p3 6f7c1984-08a6-4f13-851f-8241232688c2:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2] Apr 23 13:31:49.013057 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:49.012983 2576 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 
fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 23 13:31:49.019607 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:49.019498 2576 manager.go:217] Machine: {Timestamp:2026-04-23 13:31:49.017362673 +0000 UTC m=+0.475821942 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3100935 MemoryCapacity:33164480512 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2fc6077dfb2bc0fed401870a13c008 SystemUUID:ec2fc607-7dfb-2bc0-fed4-01870a13c008 BootID:2dfb0e83-f02b-4760-8e6e-a9a100430319 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582238208 Type:vfs Inodes:4048398 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582242304 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:25:78:45:17:49 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:25:78:45:17:49 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:7a:97:7e:d1:bf:11 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164480512 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] 
Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 23 13:31:49.019607 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:49.019602 2576 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Apr 23 13:31:49.019710 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:49.019682 2576 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 23 13:31:49.020882 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:49.020857 2576 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 23 13:31:49.021026 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:49.020884 2576 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-134-22.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"con
tainer","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 23 13:31:49.021068 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:49.021036 2576 topology_manager.go:138] "Creating topology manager with none policy" Apr 23 13:31:49.021068 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:49.021045 2576 container_manager_linux.go:306] "Creating device plugin manager" Apr 23 13:31:49.021068 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:49.021058 2576 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 23 13:31:49.021909 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:49.021899 2576 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 23 13:31:49.023828 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:49.023818 2576 state_mem.go:36] "Initialized new in-memory state store" Apr 23 13:31:49.023946 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:49.023936 2576 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 23 13:31:49.026656 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:49.026647 2576 kubelet.go:491] "Attempting to sync node with API server" Apr 23 13:31:49.026688 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:49.026663 2576 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 23 13:31:49.026688 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:49.026676 2576 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 23 13:31:49.026688 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:49.026685 2576 kubelet.go:397] "Adding apiserver pod source" Apr 23 13:31:49.026813 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:49.026696 2576 apiserver.go:42] "Waiting for node sync before watching 
apiserver pods" Apr 23 13:31:49.028123 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:49.028109 2576 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 23 13:31:49.028162 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:49.028132 2576 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 23 13:31:49.031640 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:49.031626 2576 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 23 13:31:49.033140 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:49.033127 2576 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 23 13:31:49.035219 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:49.035208 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 23 13:31:49.035271 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:49.035224 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 23 13:31:49.035271 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:49.035231 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 23 13:31:49.035271 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:49.035236 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 23 13:31:49.035271 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:49.035242 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 23 13:31:49.035271 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:49.035248 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 23 13:31:49.035271 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:49.035254 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 23 
13:31:49.035271 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:49.035260 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 23 13:31:49.035271 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:49.035268 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 23 13:31:49.035271 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:49.035274 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 23 13:31:49.035524 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:49.035283 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 23 13:31:49.035524 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:49.035293 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 23 13:31:49.036303 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:49.036291 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 23 13:31:49.036303 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:49.036301 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 23 13:31:49.042780 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:49.042752 2576 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-134-22.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 23 13:31:49.042878 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:31:49.042802 2576 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-134-22.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 23 13:31:49.042923 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:31:49.042903 2576 reflector.go:200] "Failed to watch" err="failed to list 
*v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 23 13:31:49.043507 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:49.043492 2576 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 23 13:31:49.043541 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:49.043533 2576 server.go:1295] "Started kubelet" Apr 23 13:31:49.043673 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:49.043624 2576 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 23 13:31:49.043758 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:49.043700 2576 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 23 13:31:49.043802 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:49.043765 2576 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 23 13:31:49.044263 ip-10-0-134-22 systemd[1]: Started Kubernetes Kubelet. 
Apr 23 13:31:49.045224 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:49.045208 2576 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 23 13:31:49.045637 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:49.045621 2576 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-tbsgc" Apr 23 13:31:49.046700 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:49.046687 2576 server.go:317] "Adding debug handlers to kubelet server" Apr 23 13:31:49.051213 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:31:49.049952 2576 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-134-22.ec2.internal.18a8ff9e75c64f0f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-134-22.ec2.internal,UID:ip-10-0-134-22.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-134-22.ec2.internal,},FirstTimestamp:2026-04-23 13:31:49.043506959 +0000 UTC m=+0.501966243,LastTimestamp:2026-04-23 13:31:49.043506959 +0000 UTC m=+0.501966243,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-134-22.ec2.internal,}" Apr 23 13:31:49.052964 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:49.052817 2576 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-tbsgc" Apr 23 13:31:49.053970 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:49.053952 2576 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 23 13:31:49.053970 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:49.053957 
2576 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 23 13:31:49.056294 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:49.054708 2576 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 23 13:31:49.056294 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:49.054729 2576 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 23 13:31:49.056294 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:49.054766 2576 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 23 13:31:49.056294 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:49.054782 2576 factory.go:55] Registering systemd factory Apr 23 13:31:49.056294 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:49.054793 2576 factory.go:223] Registration of the systemd container factory successfully Apr 23 13:31:49.056294 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:49.054869 2576 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 23 13:31:49.056294 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:49.054933 2576 reconstruct.go:97] "Volume reconstruction finished" Apr 23 13:31:49.056294 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:49.054940 2576 reconciler.go:26] "Reconciler: start to sync state" Apr 23 13:31:49.056294 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:49.055098 2576 factory.go:153] Registering CRI-O factory Apr 23 13:31:49.056294 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:49.055110 2576 factory.go:223] Registration of the crio container factory successfully Apr 23 13:31:49.056294 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:49.055133 2576 factory.go:103] Registering Raw factory Apr 23 13:31:49.056294 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:49.055146 2576 manager.go:1196] Started watching for 
new ooms in manager Apr 23 13:31:49.056294 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:49.055707 2576 manager.go:319] Starting recovery of all containers Apr 23 13:31:49.057310 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:31:49.057281 2576 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 23 13:31:49.057453 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:31:49.057434 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-22.ec2.internal\" not found" Apr 23 13:31:49.066343 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:49.066225 2576 manager.go:324] Recovery completed Apr 23 13:31:49.070629 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:49.070612 2576 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 23 13:31:49.071676 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:49.071659 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 23 13:31:49.073497 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:31:49.073476 2576 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-134-22.ec2.internal\" not found" node="ip-10-0-134-22.ec2.internal" Apr 23 13:31:49.078197 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:49.078184 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-22.ec2.internal" event="NodeHasSufficientMemory" Apr 23 13:31:49.078242 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:49.078212 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-22.ec2.internal" event="NodeHasNoDiskPressure" Apr 23 13:31:49.078242 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:49.078227 2576 kubelet_node_status.go:736] "Recording event message for node" 
node="ip-10-0-134-22.ec2.internal" event="NodeHasSufficientPID" Apr 23 13:31:49.078714 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:49.078700 2576 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 23 13:31:49.078714 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:49.078713 2576 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 23 13:31:49.078854 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:49.078733 2576 state_mem.go:36] "Initialized new in-memory state store" Apr 23 13:31:49.081846 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:49.081831 2576 policy_none.go:49] "None policy: Start" Apr 23 13:31:49.081927 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:49.081852 2576 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 23 13:31:49.081927 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:49.081865 2576 state_mem.go:35] "Initializing new in-memory state store" Apr 23 13:31:49.120997 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:49.120980 2576 manager.go:341] "Starting Device Plugin manager" Apr 23 13:31:49.122600 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:31:49.121016 2576 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 23 13:31:49.122600 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:49.121028 2576 server.go:85] "Starting device plugin registration server" Apr 23 13:31:49.122600 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:49.121257 2576 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 23 13:31:49.122600 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:49.121269 2576 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 23 13:31:49.122600 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:49.121352 2576 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 23 13:31:49.122600 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:49.121433 
2576 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 23 13:31:49.122600 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:49.121440 2576 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 23 13:31:49.122600 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:31:49.121951 2576 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 23 13:31:49.122600 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:31:49.121981 2576 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-134-22.ec2.internal\" not found" Apr 23 13:31:49.153595 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:49.153566 2576 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 23 13:31:49.154942 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:49.154916 2576 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 23 13:31:49.155041 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:49.154948 2576 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 23 13:31:49.155041 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:49.154970 2576 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 23 13:31:49.155041 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:49.154979 2576 kubelet.go:2451] "Starting kubelet main sync loop" Apr 23 13:31:49.155041 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:31:49.155018 2576 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 23 13:31:49.158218 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:49.158196 2576 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 23 13:31:49.222396 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:49.222325 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 23 13:31:49.223445 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:49.223430 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-22.ec2.internal" event="NodeHasSufficientMemory" Apr 23 13:31:49.223516 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:49.223460 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-22.ec2.internal" event="NodeHasNoDiskPressure" Apr 23 13:31:49.223516 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:49.223470 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-22.ec2.internal" event="NodeHasSufficientPID" Apr 23 13:31:49.223516 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:49.223492 2576 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-134-22.ec2.internal" Apr 23 13:31:49.233054 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:49.233039 2576 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-134-22.ec2.internal" Apr 23 13:31:49.233107 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:31:49.233061 2576 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-134-22.ec2.internal\": node \"ip-10-0-134-22.ec2.internal\" not found" Apr 23 13:31:49.248321 
ip-10-0-134-22 kubenswrapper[2576]: E0423 13:31:49.248300 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-22.ec2.internal\" not found" Apr 23 13:31:49.255452 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:49.255406 2576 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-22.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-134-22.ec2.internal"] Apr 23 13:31:49.255507 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:49.255492 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 23 13:31:49.256296 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:49.256280 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-22.ec2.internal" event="NodeHasSufficientMemory" Apr 23 13:31:49.256364 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:49.256312 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-22.ec2.internal" event="NodeHasNoDiskPressure" Apr 23 13:31:49.256364 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:49.256327 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-22.ec2.internal" event="NodeHasSufficientPID" Apr 23 13:31:49.257449 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:49.257437 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 23 13:31:49.257590 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:49.257575 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-22.ec2.internal" Apr 23 13:31:49.257641 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:49.257605 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 23 13:31:49.259298 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:49.259272 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-22.ec2.internal" event="NodeHasSufficientMemory" Apr 23 13:31:49.259298 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:49.259296 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-22.ec2.internal" event="NodeHasSufficientMemory" Apr 23 13:31:49.259479 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:49.259318 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-22.ec2.internal" event="NodeHasNoDiskPressure" Apr 23 13:31:49.259479 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:49.259328 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-22.ec2.internal" event="NodeHasSufficientPID" Apr 23 13:31:49.259479 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:49.259302 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-22.ec2.internal" event="NodeHasNoDiskPressure" Apr 23 13:31:49.259479 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:49.259383 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-22.ec2.internal" event="NodeHasSufficientPID" Apr 23 13:31:49.260563 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:49.260550 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-22.ec2.internal" Apr 23 13:31:49.260609 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:49.260574 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 23 13:31:49.261227 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:49.261214 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-22.ec2.internal" event="NodeHasSufficientMemory" Apr 23 13:31:49.261279 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:49.261238 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-22.ec2.internal" event="NodeHasNoDiskPressure" Apr 23 13:31:49.261279 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:49.261247 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-22.ec2.internal" event="NodeHasSufficientPID" Apr 23 13:31:49.274925 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:31:49.274908 2576 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-134-22.ec2.internal\" not found" node="ip-10-0-134-22.ec2.internal" Apr 23 13:31:49.278748 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:31:49.278729 2576 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-134-22.ec2.internal\" not found" node="ip-10-0-134-22.ec2.internal" Apr 23 13:31:49.349164 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:31:49.349138 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-22.ec2.internal\" not found" Apr 23 13:31:49.449874 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:31:49.449844 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-22.ec2.internal\" not found" Apr 23 13:31:49.456171 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:49.456140 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/f546dccbfe88d958c8bad79dd015e11c-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-134-22.ec2.internal\" (UID: \"f546dccbfe88d958c8bad79dd015e11c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-22.ec2.internal" Apr 23 13:31:49.456171 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:49.456170 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f546dccbfe88d958c8bad79dd015e11c-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-134-22.ec2.internal\" (UID: \"f546dccbfe88d958c8bad79dd015e11c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-22.ec2.internal" Apr 23 13:31:49.456341 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:49.456189 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/1b90ee820fd4186f1e6cd40d24ef3276-config\") pod \"kube-apiserver-proxy-ip-10-0-134-22.ec2.internal\" (UID: \"1b90ee820fd4186f1e6cd40d24ef3276\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-134-22.ec2.internal" Apr 23 13:31:49.550598 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:31:49.550527 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-22.ec2.internal\" not found" Apr 23 13:31:49.556876 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:49.556853 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f546dccbfe88d958c8bad79dd015e11c-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-134-22.ec2.internal\" (UID: \"f546dccbfe88d958c8bad79dd015e11c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-22.ec2.internal" Apr 23 13:31:49.556931 
ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:49.556883 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/1b90ee820fd4186f1e6cd40d24ef3276-config\") pod \"kube-apiserver-proxy-ip-10-0-134-22.ec2.internal\" (UID: \"1b90ee820fd4186f1e6cd40d24ef3276\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-134-22.ec2.internal" Apr 23 13:31:49.556931 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:49.556903 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/f546dccbfe88d958c8bad79dd015e11c-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-134-22.ec2.internal\" (UID: \"f546dccbfe88d958c8bad79dd015e11c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-22.ec2.internal" Apr 23 13:31:49.556996 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:49.556955 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/f546dccbfe88d958c8bad79dd015e11c-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-134-22.ec2.internal\" (UID: \"f546dccbfe88d958c8bad79dd015e11c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-22.ec2.internal" Apr 23 13:31:49.556996 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:49.556959 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/1b90ee820fd4186f1e6cd40d24ef3276-config\") pod \"kube-apiserver-proxy-ip-10-0-134-22.ec2.internal\" (UID: \"1b90ee820fd4186f1e6cd40d24ef3276\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-134-22.ec2.internal" Apr 23 13:31:49.556996 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:49.556963 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f546dccbfe88d958c8bad79dd015e11c-var-lib-kubelet\") pod 
\"kube-rbac-proxy-crio-ip-10-0-134-22.ec2.internal\" (UID: \"f546dccbfe88d958c8bad79dd015e11c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-22.ec2.internal" Apr 23 13:31:49.575977 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:49.575959 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-22.ec2.internal" Apr 23 13:31:49.581369 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:49.581346 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-22.ec2.internal" Apr 23 13:31:49.650961 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:31:49.650933 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-22.ec2.internal\" not found" Apr 23 13:31:49.751452 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:31:49.751423 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-22.ec2.internal\" not found" Apr 23 13:31:49.852025 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:31:49.851942 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-22.ec2.internal\" not found" Apr 23 13:31:49.944448 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:49.944406 2576 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 23 13:31:49.944992 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:49.944572 2576 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 23 13:31:49.944992 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:49.944585 2576 reflector.go:556] "Warning: watch ended with error" 
reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 23 13:31:49.952534 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:31:49.952517 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-22.ec2.internal\" not found" Apr 23 13:31:50.053023 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:31:50.052992 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-22.ec2.internal\" not found" Apr 23 13:31:50.054086 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:50.054068 2576 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 23 13:31:50.056031 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:50.055998 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-22 13:26:49 +0000 UTC" deadline="2027-12-28 19:43:28.043647025 +0000 UTC" Apr 23 13:31:50.056085 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:50.056031 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14742h11m37.987618613s" Apr 23 13:31:50.066173 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:50.066149 2576 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 23 13:31:50.079700 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:50.079652 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b90ee820fd4186f1e6cd40d24ef3276.slice/crio-77085c624d0f153b54b6694c5fd0ab8b5fec0722dc1566591adb89cd47df73c4 WatchSource:0}: Error finding container 
77085c624d0f153b54b6694c5fd0ab8b5fec0722dc1566591adb89cd47df73c4: Status 404 returned error can't find the container with id 77085c624d0f153b54b6694c5fd0ab8b5fec0722dc1566591adb89cd47df73c4 Apr 23 13:31:50.080014 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:50.079999 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf546dccbfe88d958c8bad79dd015e11c.slice/crio-9ff6e0860fba8f66efa91b7f698173040a993a0eceffb8ade5dbfe4faea47627 WatchSource:0}: Error finding container 9ff6e0860fba8f66efa91b7f698173040a993a0eceffb8ade5dbfe4faea47627: Status 404 returned error can't find the container with id 9ff6e0860fba8f66efa91b7f698173040a993a0eceffb8ade5dbfe4faea47627 Apr 23 13:31:50.085025 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:50.085011 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 13:31:50.107660 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:50.107608 2576 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-mft4s" Apr 23 13:31:50.116454 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:50.116426 2576 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-mft4s" Apr 23 13:31:50.153910 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:31:50.153893 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-22.ec2.internal\" not found" Apr 23 13:31:50.157891 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:50.157850 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-22.ec2.internal" event={"ID":"f546dccbfe88d958c8bad79dd015e11c","Type":"ContainerStarted","Data":"9ff6e0860fba8f66efa91b7f698173040a993a0eceffb8ade5dbfe4faea47627"} Apr 23 13:31:50.158882 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:50.158860 
2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-22.ec2.internal" event={"ID":"1b90ee820fd4186f1e6cd40d24ef3276","Type":"ContainerStarted","Data":"77085c624d0f153b54b6694c5fd0ab8b5fec0722dc1566591adb89cd47df73c4"} Apr 23 13:31:50.240954 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:50.240926 2576 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 23 13:31:50.254782 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:50.254760 2576 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-22.ec2.internal" Apr 23 13:31:50.269277 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:50.269262 2576 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 23 13:31:50.271202 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:50.271190 2576 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-22.ec2.internal" Apr 23 13:31:50.279519 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:50.279505 2576 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 23 13:31:50.284780 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:50.284766 2576 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 23 13:31:50.787213 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:50.787183 2576 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 23 13:31:51.027661 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.027630 2576 apiserver.go:52] "Watching apiserver" Apr 23 13:31:51.037373 ip-10-0-134-22 kubenswrapper[2576]: 
I0423 13:31:51.037300 2576 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 23 13:31:51.038996 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.038969 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-operator/iptables-alerter-66br2","openshift-ovn-kubernetes/ovnkube-node-72hmc","kube-system/global-pull-secret-syncer-kjxcs","kube-system/konnectivity-agent-62dc8","kube-system/kube-apiserver-proxy-ip-10-0-134-22.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-glcfn","openshift-cluster-node-tuning-operator/tuned-zzbbp","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-22.ec2.internal","openshift-multus/multus-additional-cni-plugins-57l24","openshift-multus/network-metrics-daemon-564gb","openshift-dns/node-resolver-cj9dc","openshift-image-registry/node-ca-jfnjq","openshift-multus/multus-275jz","openshift-network-diagnostics/network-check-target-dggd8"] Apr 23 13:31:51.041041 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.040982 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-zzbbp" Apr 23 13:31:51.041979 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.041950 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-cj9dc" Apr 23 13:31:51.042065 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.042041 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-kjxcs" Apr 23 13:31:51.042145 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:31:51.042118 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-kjxcs" podUID="2d75111d-f6a4-4ae1-8c0b-e20c6a9b9515" Apr 23 13:31:51.043430 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.043389 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 23 13:31:51.043524 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.043501 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 23 13:31:51.043524 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.043500 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-sr2g2\"" Apr 23 13:31:51.043826 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.043810 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-vk77b\"" Apr 23 13:31:51.044065 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.044043 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 23 13:31:51.044158 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.044095 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 23 13:31:51.044276 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.044256 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-62dc8" Apr 23 13:31:51.045646 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.045630 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dggd8" Apr 23 13:31:51.045740 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:31:51.045695 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dggd8" podUID="ab515304-a26a-4e7e-bfeb-cc0ca3a93c8e" Apr 23 13:31:51.045807 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.045779 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-glcfn" Apr 23 13:31:51.046554 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.046358 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 23 13:31:51.046554 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.046391 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 23 13:31:51.046554 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.046455 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-dswd5\"" Apr 23 13:31:51.046758 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.046692 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-57l24" Apr 23 13:31:51.047547 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.047525 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 23 13:31:51.047639 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.047603 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 23 13:31:51.047808 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.047790 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-275jz" Apr 23 13:31:51.048056 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.048037 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 23 13:31:51.049485 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.049028 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-72hmc" Apr 23 13:31:51.049714 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.049666 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 23 13:31:51.049714 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.049705 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-pcrnb\"" Apr 23 13:31:51.049934 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.049916 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 23 13:31:51.049934 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.049931 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-zvfbj\"" Apr 23 13:31:51.050078 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.050023 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 23 13:31:51.050130 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.050085 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 23 13:31:51.050429 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.050400 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 23 13:31:51.050619 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.050406 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-564gb" Apr 23 13:31:51.050707 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:31:51.050687 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-564gb" podUID="0dcb0e48-f774-49cd-8b04-58a5050a5ff2" Apr 23 13:31:51.051114 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.051095 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-8v757\"" Apr 23 13:31:51.051114 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.051105 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 23 13:31:51.051252 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.051114 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 23 13:31:51.051567 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.051549 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 23 13:31:51.051664 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.051628 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 23 13:31:51.051861 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.051834 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-66br2" Apr 23 13:31:51.052203 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.052055 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 23 13:31:51.052203 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.052153 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 23 13:31:51.052361 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.052344 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-8cjws\"" Apr 23 13:31:51.052785 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.052768 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 23 13:31:51.053246 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.053222 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-jfnjq" Apr 23 13:31:51.055051 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.054990 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 23 13:31:51.055796 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.055278 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 23 13:31:51.055796 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.055373 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 23 13:31:51.055796 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.055499 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 23 13:31:51.055796 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.055633 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 23 13:31:51.055796 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.055737 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-58r5z\"" Apr 23 13:31:51.056095 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.055859 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 23 13:31:51.056095 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.055903 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-jcpkp\"" Apr 23 13:31:51.056095 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.056057 2576 desired_state_of_world_populator.go:158] "Finished populating initial desired 
state of world" Apr 23 13:31:51.063759 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.063732 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qg9z9\" (UniqueName: \"kubernetes.io/projected/0dcb0e48-f774-49cd-8b04-58a5050a5ff2-kube-api-access-qg9z9\") pod \"network-metrics-daemon-564gb\" (UID: \"0dcb0e48-f774-49cd-8b04-58a5050a5ff2\") " pod="openshift-multus/network-metrics-daemon-564gb" Apr 23 13:31:51.063861 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.063772 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d9ada073-20d9-454e-b803-aef6be2e17c7-host\") pod \"tuned-zzbbp\" (UID: \"d9ada073-20d9-454e-b803-aef6be2e17c7\") " pod="openshift-cluster-node-tuning-operator/tuned-zzbbp" Apr 23 13:31:51.063861 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.063837 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/55a0340a-0400-482b-8422-3e0465f0802d-multus-cni-dir\") pod \"multus-275jz\" (UID: \"55a0340a-0400-482b-8422-3e0465f0802d\") " pod="openshift-multus/multus-275jz" Apr 23 13:31:51.063970 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.063874 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/edb014da-0558-4a2a-9f98-bea52a2c723e-host-slash\") pod \"ovnkube-node-72hmc\" (UID: \"edb014da-0558-4a2a-9f98-bea52a2c723e\") " pod="openshift-ovn-kubernetes/ovnkube-node-72hmc" Apr 23 13:31:51.063970 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.063895 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/edb014da-0558-4a2a-9f98-bea52a2c723e-etc-openvswitch\") pod 
\"ovnkube-node-72hmc\" (UID: \"edb014da-0558-4a2a-9f98-bea52a2c723e\") " pod="openshift-ovn-kubernetes/ovnkube-node-72hmc" Apr 23 13:31:51.063970 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.063911 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b7e52e35-9f8f-43be-b9d9-69181afa13ed-kubelet-dir\") pod \"aws-ebs-csi-driver-node-glcfn\" (UID: \"b7e52e35-9f8f-43be-b9d9-69181afa13ed\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-glcfn" Apr 23 13:31:51.063970 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.063930 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/b7e52e35-9f8f-43be-b9d9-69181afa13ed-sys-fs\") pod \"aws-ebs-csi-driver-node-glcfn\" (UID: \"b7e52e35-9f8f-43be-b9d9-69181afa13ed\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-glcfn" Apr 23 13:31:51.063970 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.063952 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/098e4208-e230-428a-af72-f1aa64c09ce0-serviceca\") pod \"node-ca-jfnjq\" (UID: \"098e4208-e230-428a-af72-f1aa64c09ce0\") " pod="openshift-image-registry/node-ca-jfnjq" Apr 23 13:31:51.064175 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.063973 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d9ada073-20d9-454e-b803-aef6be2e17c7-sys\") pod \"tuned-zzbbp\" (UID: \"d9ada073-20d9-454e-b803-aef6be2e17c7\") " pod="openshift-cluster-node-tuning-operator/tuned-zzbbp" Apr 23 13:31:51.064175 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.063997 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/2d75111d-f6a4-4ae1-8c0b-e20c6a9b9515-original-pull-secret\") pod \"global-pull-secret-syncer-kjxcs\" (UID: \"2d75111d-f6a4-4ae1-8c0b-e20c6a9b9515\") " pod="kube-system/global-pull-secret-syncer-kjxcs" Apr 23 13:31:51.064175 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.064020 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/55a0340a-0400-482b-8422-3e0465f0802d-host-run-netns\") pod \"multus-275jz\" (UID: \"55a0340a-0400-482b-8422-3e0465f0802d\") " pod="openshift-multus/multus-275jz" Apr 23 13:31:51.064175 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.064042 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/8ddfabc2-1040-4841-9473-ed5ba1c0c775-hosts-file\") pod \"node-resolver-cj9dc\" (UID: \"8ddfabc2-1040-4841-9473-ed5ba1c0c775\") " pod="openshift-dns/node-resolver-cj9dc" Apr 23 13:31:51.064175 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.064085 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/edb014da-0558-4a2a-9f98-bea52a2c723e-host-kubelet\") pod \"ovnkube-node-72hmc\" (UID: \"edb014da-0558-4a2a-9f98-bea52a2c723e\") " pod="openshift-ovn-kubernetes/ovnkube-node-72hmc" Apr 23 13:31:51.064175 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.064124 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qzwvw\" (UniqueName: \"kubernetes.io/projected/edb014da-0558-4a2a-9f98-bea52a2c723e-kube-api-access-qzwvw\") pod \"ovnkube-node-72hmc\" (UID: \"edb014da-0558-4a2a-9f98-bea52a2c723e\") " pod="openshift-ovn-kubernetes/ovnkube-node-72hmc" Apr 23 13:31:51.064175 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.064150 
2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/b7e52e35-9f8f-43be-b9d9-69181afa13ed-device-dir\") pod \"aws-ebs-csi-driver-node-glcfn\" (UID: \"b7e52e35-9f8f-43be-b9d9-69181afa13ed\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-glcfn" Apr 23 13:31:51.064519 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.064202 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/55a0340a-0400-482b-8422-3e0465f0802d-cnibin\") pod \"multus-275jz\" (UID: \"55a0340a-0400-482b-8422-3e0465f0802d\") " pod="openshift-multus/multus-275jz" Apr 23 13:31:51.064519 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.064262 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/edb014da-0558-4a2a-9f98-bea52a2c723e-host-run-netns\") pod \"ovnkube-node-72hmc\" (UID: \"edb014da-0558-4a2a-9f98-bea52a2c723e\") " pod="openshift-ovn-kubernetes/ovnkube-node-72hmc" Apr 23 13:31:51.064519 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.064289 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/dc7fc0e6-0ff2-4765-b7da-e0e49fb829cd-cnibin\") pod \"multus-additional-cni-plugins-57l24\" (UID: \"dc7fc0e6-0ff2-4765-b7da-e0e49fb829cd\") " pod="openshift-multus/multus-additional-cni-plugins-57l24" Apr 23 13:31:51.064519 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.064334 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/dc7fc0e6-0ff2-4765-b7da-e0e49fb829cd-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-57l24\" (UID: \"dc7fc0e6-0ff2-4765-b7da-e0e49fb829cd\") " 
pod="openshift-multus/multus-additional-cni-plugins-57l24" Apr 23 13:31:51.064519 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.064363 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d9ada073-20d9-454e-b803-aef6be2e17c7-var-lib-kubelet\") pod \"tuned-zzbbp\" (UID: \"d9ada073-20d9-454e-b803-aef6be2e17c7\") " pod="openshift-cluster-node-tuning-operator/tuned-zzbbp" Apr 23 13:31:51.064519 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.064389 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wl5m7\" (UniqueName: \"kubernetes.io/projected/d9ada073-20d9-454e-b803-aef6be2e17c7-kube-api-access-wl5m7\") pod \"tuned-zzbbp\" (UID: \"d9ada073-20d9-454e-b803-aef6be2e17c7\") " pod="openshift-cluster-node-tuning-operator/tuned-zzbbp" Apr 23 13:31:51.064519 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.064428 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0dcb0e48-f774-49cd-8b04-58a5050a5ff2-metrics-certs\") pod \"network-metrics-daemon-564gb\" (UID: \"0dcb0e48-f774-49cd-8b04-58a5050a5ff2\") " pod="openshift-multus/network-metrics-daemon-564gb" Apr 23 13:31:51.064519 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.064468 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/d9ada073-20d9-454e-b803-aef6be2e17c7-run\") pod \"tuned-zzbbp\" (UID: \"d9ada073-20d9-454e-b803-aef6be2e17c7\") " pod="openshift-cluster-node-tuning-operator/tuned-zzbbp" Apr 23 13:31:51.064519 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.064505 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5hblj\" (UniqueName: 
\"kubernetes.io/projected/ab515304-a26a-4e7e-bfeb-cc0ca3a93c8e-kube-api-access-5hblj\") pod \"network-check-target-dggd8\" (UID: \"ab515304-a26a-4e7e-bfeb-cc0ca3a93c8e\") " pod="openshift-network-diagnostics/network-check-target-dggd8" Apr 23 13:31:51.064971 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.064533 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/edb014da-0558-4a2a-9f98-bea52a2c723e-node-log\") pod \"ovnkube-node-72hmc\" (UID: \"edb014da-0558-4a2a-9f98-bea52a2c723e\") " pod="openshift-ovn-kubernetes/ovnkube-node-72hmc" Apr 23 13:31:51.064971 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.064568 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/edb014da-0558-4a2a-9f98-bea52a2c723e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-72hmc\" (UID: \"edb014da-0558-4a2a-9f98-bea52a2c723e\") " pod="openshift-ovn-kubernetes/ovnkube-node-72hmc" Apr 23 13:31:51.064971 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.064592 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d9ada073-20d9-454e-b803-aef6be2e17c7-tmp\") pod \"tuned-zzbbp\" (UID: \"d9ada073-20d9-454e-b803-aef6be2e17c7\") " pod="openshift-cluster-node-tuning-operator/tuned-zzbbp" Apr 23 13:31:51.064971 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.064614 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/55a0340a-0400-482b-8422-3e0465f0802d-system-cni-dir\") pod \"multus-275jz\" (UID: \"55a0340a-0400-482b-8422-3e0465f0802d\") " pod="openshift-multus/multus-275jz" Apr 23 13:31:51.064971 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.064640 
2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/55a0340a-0400-482b-8422-3e0465f0802d-cni-binary-copy\") pod \"multus-275jz\" (UID: \"55a0340a-0400-482b-8422-3e0465f0802d\") " pod="openshift-multus/multus-275jz" Apr 23 13:31:51.064971 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.064669 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/edb014da-0558-4a2a-9f98-bea52a2c723e-systemd-units\") pod \"ovnkube-node-72hmc\" (UID: \"edb014da-0558-4a2a-9f98-bea52a2c723e\") " pod="openshift-ovn-kubernetes/ovnkube-node-72hmc" Apr 23 13:31:51.064971 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.064693 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/324be42f-87e9-413c-a39b-1c5ebac3ad6d-host-slash\") pod \"iptables-alerter-66br2\" (UID: \"324be42f-87e9-413c-a39b-1c5ebac3ad6d\") " pod="openshift-network-operator/iptables-alerter-66br2" Apr 23 13:31:51.064971 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.064718 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/2d75111d-f6a4-4ae1-8c0b-e20c6a9b9515-kubelet-config\") pod \"global-pull-secret-syncer-kjxcs\" (UID: \"2d75111d-f6a4-4ae1-8c0b-e20c6a9b9515\") " pod="kube-system/global-pull-secret-syncer-kjxcs" Apr 23 13:31:51.064971 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.064758 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/55a0340a-0400-482b-8422-3e0465f0802d-os-release\") pod \"multus-275jz\" (UID: \"55a0340a-0400-482b-8422-3e0465f0802d\") " pod="openshift-multus/multus-275jz" Apr 23 
13:31:51.064971 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.064782 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/55a0340a-0400-482b-8422-3e0465f0802d-hostroot\") pod \"multus-275jz\" (UID: \"55a0340a-0400-482b-8422-3e0465f0802d\") " pod="openshift-multus/multus-275jz" Apr 23 13:31:51.064971 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.064803 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/edb014da-0558-4a2a-9f98-bea52a2c723e-host-cni-bin\") pod \"ovnkube-node-72hmc\" (UID: \"edb014da-0558-4a2a-9f98-bea52a2c723e\") " pod="openshift-ovn-kubernetes/ovnkube-node-72hmc" Apr 23 13:31:51.064971 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.064824 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/349fe628-9a5e-4f45-bd89-4d75157af516-konnectivity-ca\") pod \"konnectivity-agent-62dc8\" (UID: \"349fe628-9a5e-4f45-bd89-4d75157af516\") " pod="kube-system/konnectivity-agent-62dc8" Apr 23 13:31:51.064971 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.064847 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zb745\" (UniqueName: \"kubernetes.io/projected/324be42f-87e9-413c-a39b-1c5ebac3ad6d-kube-api-access-zb745\") pod \"iptables-alerter-66br2\" (UID: \"324be42f-87e9-413c-a39b-1c5ebac3ad6d\") " pod="openshift-network-operator/iptables-alerter-66br2" Apr 23 13:31:51.064971 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.064871 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/dc7fc0e6-0ff2-4765-b7da-e0e49fb829cd-system-cni-dir\") pod 
\"multus-additional-cni-plugins-57l24\" (UID: \"dc7fc0e6-0ff2-4765-b7da-e0e49fb829cd\") " pod="openshift-multus/multus-additional-cni-plugins-57l24" Apr 23 13:31:51.064971 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.064896 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/dc7fc0e6-0ff2-4765-b7da-e0e49fb829cd-cni-binary-copy\") pod \"multus-additional-cni-plugins-57l24\" (UID: \"dc7fc0e6-0ff2-4765-b7da-e0e49fb829cd\") " pod="openshift-multus/multus-additional-cni-plugins-57l24" Apr 23 13:31:51.064971 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.064938 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/55a0340a-0400-482b-8422-3e0465f0802d-host-var-lib-kubelet\") pod \"multus-275jz\" (UID: \"55a0340a-0400-482b-8422-3e0465f0802d\") " pod="openshift-multus/multus-275jz" Apr 23 13:31:51.065742 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.064980 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/edb014da-0558-4a2a-9f98-bea52a2c723e-run-ovn\") pod \"ovnkube-node-72hmc\" (UID: \"edb014da-0558-4a2a-9f98-bea52a2c723e\") " pod="openshift-ovn-kubernetes/ovnkube-node-72hmc" Apr 23 13:31:51.065742 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.065003 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/edb014da-0558-4a2a-9f98-bea52a2c723e-host-run-ovn-kubernetes\") pod \"ovnkube-node-72hmc\" (UID: \"edb014da-0558-4a2a-9f98-bea52a2c723e\") " pod="openshift-ovn-kubernetes/ovnkube-node-72hmc" Apr 23 13:31:51.065742 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.065024 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8ddfabc2-1040-4841-9473-ed5ba1c0c775-tmp-dir\") pod \"node-resolver-cj9dc\" (UID: \"8ddfabc2-1040-4841-9473-ed5ba1c0c775\") " pod="openshift-dns/node-resolver-cj9dc" Apr 23 13:31:51.065742 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.065053 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/b7e52e35-9f8f-43be-b9d9-69181afa13ed-etc-selinux\") pod \"aws-ebs-csi-driver-node-glcfn\" (UID: \"b7e52e35-9f8f-43be-b9d9-69181afa13ed\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-glcfn" Apr 23 13:31:51.065742 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.065068 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/d9ada073-20d9-454e-b803-aef6be2e17c7-etc-systemd\") pod \"tuned-zzbbp\" (UID: \"d9ada073-20d9-454e-b803-aef6be2e17c7\") " pod="openshift-cluster-node-tuning-operator/tuned-zzbbp" Apr 23 13:31:51.065742 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.065106 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/dc7fc0e6-0ff2-4765-b7da-e0e49fb829cd-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-57l24\" (UID: \"dc7fc0e6-0ff2-4765-b7da-e0e49fb829cd\") " pod="openshift-multus/multus-additional-cni-plugins-57l24" Apr 23 13:31:51.065742 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.065136 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/55a0340a-0400-482b-8422-3e0465f0802d-multus-daemon-config\") pod \"multus-275jz\" (UID: \"55a0340a-0400-482b-8422-3e0465f0802d\") " 
pod="openshift-multus/multus-275jz" Apr 23 13:31:51.065742 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.065159 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/edb014da-0558-4a2a-9f98-bea52a2c723e-log-socket\") pod \"ovnkube-node-72hmc\" (UID: \"edb014da-0558-4a2a-9f98-bea52a2c723e\") " pod="openshift-ovn-kubernetes/ovnkube-node-72hmc" Apr 23 13:31:51.065742 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.065191 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/349fe628-9a5e-4f45-bd89-4d75157af516-agent-certs\") pod \"konnectivity-agent-62dc8\" (UID: \"349fe628-9a5e-4f45-bd89-4d75157af516\") " pod="kube-system/konnectivity-agent-62dc8" Apr 23 13:31:51.065742 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.065213 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dxxk\" (UniqueName: \"kubernetes.io/projected/b7e52e35-9f8f-43be-b9d9-69181afa13ed-kube-api-access-5dxxk\") pod \"aws-ebs-csi-driver-node-glcfn\" (UID: \"b7e52e35-9f8f-43be-b9d9-69181afa13ed\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-glcfn" Apr 23 13:31:51.065742 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.065245 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/b7e52e35-9f8f-43be-b9d9-69181afa13ed-registration-dir\") pod \"aws-ebs-csi-driver-node-glcfn\" (UID: \"b7e52e35-9f8f-43be-b9d9-69181afa13ed\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-glcfn" Apr 23 13:31:51.065742 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.065269 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: 
\"kubernetes.io/configmap/324be42f-87e9-413c-a39b-1c5ebac3ad6d-iptables-alerter-script\") pod \"iptables-alerter-66br2\" (UID: \"324be42f-87e9-413c-a39b-1c5ebac3ad6d\") " pod="openshift-network-operator/iptables-alerter-66br2" Apr 23 13:31:51.065742 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.065292 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/dc7fc0e6-0ff2-4765-b7da-e0e49fb829cd-os-release\") pod \"multus-additional-cni-plugins-57l24\" (UID: \"dc7fc0e6-0ff2-4765-b7da-e0e49fb829cd\") " pod="openshift-multus/multus-additional-cni-plugins-57l24" Apr 23 13:31:51.065742 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.065317 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/edb014da-0558-4a2a-9f98-bea52a2c723e-ovn-node-metrics-cert\") pod \"ovnkube-node-72hmc\" (UID: \"edb014da-0558-4a2a-9f98-bea52a2c723e\") " pod="openshift-ovn-kubernetes/ovnkube-node-72hmc" Apr 23 13:31:51.065742 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.065341 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9gsd\" (UniqueName: \"kubernetes.io/projected/098e4208-e230-428a-af72-f1aa64c09ce0-kube-api-access-k9gsd\") pod \"node-ca-jfnjq\" (UID: \"098e4208-e230-428a-af72-f1aa64c09ce0\") " pod="openshift-image-registry/node-ca-jfnjq" Apr 23 13:31:51.065742 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.065368 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-txm7v\" (UniqueName: \"kubernetes.io/projected/dc7fc0e6-0ff2-4765-b7da-e0e49fb829cd-kube-api-access-txm7v\") pod \"multus-additional-cni-plugins-57l24\" (UID: \"dc7fc0e6-0ff2-4765-b7da-e0e49fb829cd\") " pod="openshift-multus/multus-additional-cni-plugins-57l24" Apr 23 
13:31:51.066612 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.065392 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7vlz\" (UniqueName: \"kubernetes.io/projected/55a0340a-0400-482b-8422-3e0465f0802d-kube-api-access-c7vlz\") pod \"multus-275jz\" (UID: \"55a0340a-0400-482b-8422-3e0465f0802d\") " pod="openshift-multus/multus-275jz" Apr 23 13:31:51.066612 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.065437 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/d9ada073-20d9-454e-b803-aef6be2e17c7-etc-modprobe-d\") pod \"tuned-zzbbp\" (UID: \"d9ada073-20d9-454e-b803-aef6be2e17c7\") " pod="openshift-cluster-node-tuning-operator/tuned-zzbbp" Apr 23 13:31:51.066612 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.065461 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/d9ada073-20d9-454e-b803-aef6be2e17c7-etc-sysconfig\") pod \"tuned-zzbbp\" (UID: \"d9ada073-20d9-454e-b803-aef6be2e17c7\") " pod="openshift-cluster-node-tuning-operator/tuned-zzbbp" Apr 23 13:31:51.066612 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.065512 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/d9ada073-20d9-454e-b803-aef6be2e17c7-etc-tuned\") pod \"tuned-zzbbp\" (UID: \"d9ada073-20d9-454e-b803-aef6be2e17c7\") " pod="openshift-cluster-node-tuning-operator/tuned-zzbbp" Apr 23 13:31:51.066612 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.065543 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/2d75111d-f6a4-4ae1-8c0b-e20c6a9b9515-dbus\") pod \"global-pull-secret-syncer-kjxcs\" (UID: 
\"2d75111d-f6a4-4ae1-8c0b-e20c6a9b9515\") " pod="kube-system/global-pull-secret-syncer-kjxcs" Apr 23 13:31:51.066612 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.065560 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/edb014da-0558-4a2a-9f98-bea52a2c723e-host-cni-netd\") pod \"ovnkube-node-72hmc\" (UID: \"edb014da-0558-4a2a-9f98-bea52a2c723e\") " pod="openshift-ovn-kubernetes/ovnkube-node-72hmc" Apr 23 13:31:51.066612 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.065581 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/55a0340a-0400-482b-8422-3e0465f0802d-host-var-lib-cni-bin\") pod \"multus-275jz\" (UID: \"55a0340a-0400-482b-8422-3e0465f0802d\") " pod="openshift-multus/multus-275jz" Apr 23 13:31:51.066612 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.065606 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/edb014da-0558-4a2a-9f98-bea52a2c723e-run-openvswitch\") pod \"ovnkube-node-72hmc\" (UID: \"edb014da-0558-4a2a-9f98-bea52a2c723e\") " pod="openshift-ovn-kubernetes/ovnkube-node-72hmc" Apr 23 13:31:51.066612 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.065629 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/b7e52e35-9f8f-43be-b9d9-69181afa13ed-socket-dir\") pod \"aws-ebs-csi-driver-node-glcfn\" (UID: \"b7e52e35-9f8f-43be-b9d9-69181afa13ed\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-glcfn" Apr 23 13:31:51.066612 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.065659 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/098e4208-e230-428a-af72-f1aa64c09ce0-host\") pod \"node-ca-jfnjq\" (UID: \"098e4208-e230-428a-af72-f1aa64c09ce0\") " pod="openshift-image-registry/node-ca-jfnjq" Apr 23 13:31:51.066612 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.065819 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/dc7fc0e6-0ff2-4765-b7da-e0e49fb829cd-tuning-conf-dir\") pod \"multus-additional-cni-plugins-57l24\" (UID: \"dc7fc0e6-0ff2-4765-b7da-e0e49fb829cd\") " pod="openshift-multus/multus-additional-cni-plugins-57l24" Apr 23 13:31:51.066612 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.065869 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/55a0340a-0400-482b-8422-3e0465f0802d-host-var-lib-cni-multus\") pod \"multus-275jz\" (UID: \"55a0340a-0400-482b-8422-3e0465f0802d\") " pod="openshift-multus/multus-275jz" Apr 23 13:31:51.066612 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.065895 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/55a0340a-0400-482b-8422-3e0465f0802d-multus-conf-dir\") pod \"multus-275jz\" (UID: \"55a0340a-0400-482b-8422-3e0465f0802d\") " pod="openshift-multus/multus-275jz" Apr 23 13:31:51.066612 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.065940 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/d9ada073-20d9-454e-b803-aef6be2e17c7-etc-sysctl-d\") pod \"tuned-zzbbp\" (UID: \"d9ada073-20d9-454e-b803-aef6be2e17c7\") " pod="openshift-cluster-node-tuning-operator/tuned-zzbbp" Apr 23 13:31:51.066612 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.065965 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/d9ada073-20d9-454e-b803-aef6be2e17c7-etc-sysctl-conf\") pod \"tuned-zzbbp\" (UID: \"d9ada073-20d9-454e-b803-aef6be2e17c7\") " pod="openshift-cluster-node-tuning-operator/tuned-zzbbp" Apr 23 13:31:51.066612 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.065987 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d9ada073-20d9-454e-b803-aef6be2e17c7-lib-modules\") pod \"tuned-zzbbp\" (UID: \"d9ada073-20d9-454e-b803-aef6be2e17c7\") " pod="openshift-cluster-node-tuning-operator/tuned-zzbbp" Apr 23 13:31:51.066612 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.066009 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/55a0340a-0400-482b-8422-3e0465f0802d-host-run-multus-certs\") pod \"multus-275jz\" (UID: \"55a0340a-0400-482b-8422-3e0465f0802d\") " pod="openshift-multus/multus-275jz" Apr 23 13:31:51.067266 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.066055 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/55a0340a-0400-482b-8422-3e0465f0802d-etc-kubernetes\") pod \"multus-275jz\" (UID: \"55a0340a-0400-482b-8422-3e0465f0802d\") " pod="openshift-multus/multus-275jz" Apr 23 13:31:51.067266 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.066081 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/edb014da-0558-4a2a-9f98-bea52a2c723e-run-systemd\") pod \"ovnkube-node-72hmc\" (UID: \"edb014da-0558-4a2a-9f98-bea52a2c723e\") " pod="openshift-ovn-kubernetes/ovnkube-node-72hmc" Apr 23 13:31:51.067266 ip-10-0-134-22 
kubenswrapper[2576]: I0423 13:31:51.066105 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/edb014da-0558-4a2a-9f98-bea52a2c723e-ovnkube-script-lib\") pod \"ovnkube-node-72hmc\" (UID: \"edb014da-0558-4a2a-9f98-bea52a2c723e\") " pod="openshift-ovn-kubernetes/ovnkube-node-72hmc" Apr 23 13:31:51.067266 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.066142 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9kns\" (UniqueName: \"kubernetes.io/projected/8ddfabc2-1040-4841-9473-ed5ba1c0c775-kube-api-access-w9kns\") pod \"node-resolver-cj9dc\" (UID: \"8ddfabc2-1040-4841-9473-ed5ba1c0c775\") " pod="openshift-dns/node-resolver-cj9dc" Apr 23 13:31:51.067266 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.066200 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d9ada073-20d9-454e-b803-aef6be2e17c7-etc-kubernetes\") pod \"tuned-zzbbp\" (UID: \"d9ada073-20d9-454e-b803-aef6be2e17c7\") " pod="openshift-cluster-node-tuning-operator/tuned-zzbbp" Apr 23 13:31:51.067266 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.066222 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/55a0340a-0400-482b-8422-3e0465f0802d-multus-socket-dir-parent\") pod \"multus-275jz\" (UID: \"55a0340a-0400-482b-8422-3e0465f0802d\") " pod="openshift-multus/multus-275jz" Apr 23 13:31:51.067266 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.066258 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/55a0340a-0400-482b-8422-3e0465f0802d-host-run-k8s-cni-cncf-io\") pod \"multus-275jz\" (UID: 
\"55a0340a-0400-482b-8422-3e0465f0802d\") " pod="openshift-multus/multus-275jz" Apr 23 13:31:51.067266 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.066286 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/edb014da-0558-4a2a-9f98-bea52a2c723e-var-lib-openvswitch\") pod \"ovnkube-node-72hmc\" (UID: \"edb014da-0558-4a2a-9f98-bea52a2c723e\") " pod="openshift-ovn-kubernetes/ovnkube-node-72hmc" Apr 23 13:31:51.067266 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.066315 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/edb014da-0558-4a2a-9f98-bea52a2c723e-ovnkube-config\") pod \"ovnkube-node-72hmc\" (UID: \"edb014da-0558-4a2a-9f98-bea52a2c723e\") " pod="openshift-ovn-kubernetes/ovnkube-node-72hmc" Apr 23 13:31:51.067266 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.066361 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/edb014da-0558-4a2a-9f98-bea52a2c723e-env-overrides\") pod \"ovnkube-node-72hmc\" (UID: \"edb014da-0558-4a2a-9f98-bea52a2c723e\") " pod="openshift-ovn-kubernetes/ovnkube-node-72hmc" Apr 23 13:31:51.117227 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.117199 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-22 13:26:50 +0000 UTC" deadline="2028-01-14 05:10:27.410915738 +0000 UTC" Apr 23 13:31:51.117227 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.117225 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15135h38m36.293693221s" Apr 23 13:31:51.167441 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.167401 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/edb014da-0558-4a2a-9f98-bea52a2c723e-run-systemd\") pod \"ovnkube-node-72hmc\" (UID: \"edb014da-0558-4a2a-9f98-bea52a2c723e\") " pod="openshift-ovn-kubernetes/ovnkube-node-72hmc" Apr 23 13:31:51.167580 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.167452 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/edb014da-0558-4a2a-9f98-bea52a2c723e-ovnkube-script-lib\") pod \"ovnkube-node-72hmc\" (UID: \"edb014da-0558-4a2a-9f98-bea52a2c723e\") " pod="openshift-ovn-kubernetes/ovnkube-node-72hmc" Apr 23 13:31:51.167580 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.167477 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w9kns\" (UniqueName: \"kubernetes.io/projected/8ddfabc2-1040-4841-9473-ed5ba1c0c775-kube-api-access-w9kns\") pod \"node-resolver-cj9dc\" (UID: \"8ddfabc2-1040-4841-9473-ed5ba1c0c775\") " pod="openshift-dns/node-resolver-cj9dc" Apr 23 13:31:51.167580 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.167496 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/edb014da-0558-4a2a-9f98-bea52a2c723e-run-systemd\") pod \"ovnkube-node-72hmc\" (UID: \"edb014da-0558-4a2a-9f98-bea52a2c723e\") " pod="openshift-ovn-kubernetes/ovnkube-node-72hmc" Apr 23 13:31:51.167580 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.167501 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d9ada073-20d9-454e-b803-aef6be2e17c7-etc-kubernetes\") pod \"tuned-zzbbp\" (UID: \"d9ada073-20d9-454e-b803-aef6be2e17c7\") " pod="openshift-cluster-node-tuning-operator/tuned-zzbbp" Apr 23 13:31:51.167792 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.167569 2576 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/55a0340a-0400-482b-8422-3e0465f0802d-multus-socket-dir-parent\") pod \"multus-275jz\" (UID: \"55a0340a-0400-482b-8422-3e0465f0802d\") " pod="openshift-multus/multus-275jz" Apr 23 13:31:51.167792 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.167612 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/55a0340a-0400-482b-8422-3e0465f0802d-host-run-k8s-cni-cncf-io\") pod \"multus-275jz\" (UID: \"55a0340a-0400-482b-8422-3e0465f0802d\") " pod="openshift-multus/multus-275jz" Apr 23 13:31:51.167792 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.167580 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d9ada073-20d9-454e-b803-aef6be2e17c7-etc-kubernetes\") pod \"tuned-zzbbp\" (UID: \"d9ada073-20d9-454e-b803-aef6be2e17c7\") " pod="openshift-cluster-node-tuning-operator/tuned-zzbbp" Apr 23 13:31:51.167792 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.167637 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/edb014da-0558-4a2a-9f98-bea52a2c723e-var-lib-openvswitch\") pod \"ovnkube-node-72hmc\" (UID: \"edb014da-0558-4a2a-9f98-bea52a2c723e\") " pod="openshift-ovn-kubernetes/ovnkube-node-72hmc" Apr 23 13:31:51.167792 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.167664 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/edb014da-0558-4a2a-9f98-bea52a2c723e-ovnkube-config\") pod \"ovnkube-node-72hmc\" (UID: \"edb014da-0558-4a2a-9f98-bea52a2c723e\") " pod="openshift-ovn-kubernetes/ovnkube-node-72hmc" Apr 23 13:31:51.167792 ip-10-0-134-22 kubenswrapper[2576]: I0423 
13:31:51.167701 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/edb014da-0558-4a2a-9f98-bea52a2c723e-env-overrides\") pod \"ovnkube-node-72hmc\" (UID: \"edb014da-0558-4a2a-9f98-bea52a2c723e\") " pod="openshift-ovn-kubernetes/ovnkube-node-72hmc" Apr 23 13:31:51.167792 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.167691 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/55a0340a-0400-482b-8422-3e0465f0802d-host-run-k8s-cni-cncf-io\") pod \"multus-275jz\" (UID: \"55a0340a-0400-482b-8422-3e0465f0802d\") " pod="openshift-multus/multus-275jz" Apr 23 13:31:51.167792 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.167706 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/55a0340a-0400-482b-8422-3e0465f0802d-multus-socket-dir-parent\") pod \"multus-275jz\" (UID: \"55a0340a-0400-482b-8422-3e0465f0802d\") " pod="openshift-multus/multus-275jz" Apr 23 13:31:51.167792 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.167727 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qg9z9\" (UniqueName: \"kubernetes.io/projected/0dcb0e48-f774-49cd-8b04-58a5050a5ff2-kube-api-access-qg9z9\") pod \"network-metrics-daemon-564gb\" (UID: \"0dcb0e48-f774-49cd-8b04-58a5050a5ff2\") " pod="openshift-multus/network-metrics-daemon-564gb" Apr 23 13:31:51.167792 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.167756 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d9ada073-20d9-454e-b803-aef6be2e17c7-host\") pod \"tuned-zzbbp\" (UID: \"d9ada073-20d9-454e-b803-aef6be2e17c7\") " pod="openshift-cluster-node-tuning-operator/tuned-zzbbp" Apr 23 13:31:51.167792 ip-10-0-134-22 kubenswrapper[2576]: I0423 
13:31:51.167781 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/55a0340a-0400-482b-8422-3e0465f0802d-multus-cni-dir\") pod \"multus-275jz\" (UID: \"55a0340a-0400-482b-8422-3e0465f0802d\") " pod="openshift-multus/multus-275jz" Apr 23 13:31:51.168224 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.167805 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/edb014da-0558-4a2a-9f98-bea52a2c723e-host-slash\") pod \"ovnkube-node-72hmc\" (UID: \"edb014da-0558-4a2a-9f98-bea52a2c723e\") " pod="openshift-ovn-kubernetes/ovnkube-node-72hmc" Apr 23 13:31:51.168224 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.167854 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/edb014da-0558-4a2a-9f98-bea52a2c723e-var-lib-openvswitch\") pod \"ovnkube-node-72hmc\" (UID: \"edb014da-0558-4a2a-9f98-bea52a2c723e\") " pod="openshift-ovn-kubernetes/ovnkube-node-72hmc" Apr 23 13:31:51.168224 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.167859 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/edb014da-0558-4a2a-9f98-bea52a2c723e-host-slash\") pod \"ovnkube-node-72hmc\" (UID: \"edb014da-0558-4a2a-9f98-bea52a2c723e\") " pod="openshift-ovn-kubernetes/ovnkube-node-72hmc" Apr 23 13:31:51.168224 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.167861 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d9ada073-20d9-454e-b803-aef6be2e17c7-host\") pod \"tuned-zzbbp\" (UID: \"d9ada073-20d9-454e-b803-aef6be2e17c7\") " pod="openshift-cluster-node-tuning-operator/tuned-zzbbp" Apr 23 13:31:51.168224 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.167886 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/edb014da-0558-4a2a-9f98-bea52a2c723e-etc-openvswitch\") pod \"ovnkube-node-72hmc\" (UID: \"edb014da-0558-4a2a-9f98-bea52a2c723e\") " pod="openshift-ovn-kubernetes/ovnkube-node-72hmc" Apr 23 13:31:51.169297 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.167887 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/55a0340a-0400-482b-8422-3e0465f0802d-multus-cni-dir\") pod \"multus-275jz\" (UID: \"55a0340a-0400-482b-8422-3e0465f0802d\") " pod="openshift-multus/multus-275jz" Apr 23 13:31:51.169297 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.168607 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/edb014da-0558-4a2a-9f98-bea52a2c723e-etc-openvswitch\") pod \"ovnkube-node-72hmc\" (UID: \"edb014da-0558-4a2a-9f98-bea52a2c723e\") " pod="openshift-ovn-kubernetes/ovnkube-node-72hmc" Apr 23 13:31:51.169297 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.168644 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b7e52e35-9f8f-43be-b9d9-69181afa13ed-kubelet-dir\") pod \"aws-ebs-csi-driver-node-glcfn\" (UID: \"b7e52e35-9f8f-43be-b9d9-69181afa13ed\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-glcfn" Apr 23 13:31:51.169297 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.168690 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/b7e52e35-9f8f-43be-b9d9-69181afa13ed-sys-fs\") pod \"aws-ebs-csi-driver-node-glcfn\" (UID: \"b7e52e35-9f8f-43be-b9d9-69181afa13ed\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-glcfn" Apr 23 13:31:51.169297 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.168725 2576 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/098e4208-e230-428a-af72-f1aa64c09ce0-serviceca\") pod \"node-ca-jfnjq\" (UID: \"098e4208-e230-428a-af72-f1aa64c09ce0\") " pod="openshift-image-registry/node-ca-jfnjq"
Apr 23 13:31:51.169297 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.168759 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d9ada073-20d9-454e-b803-aef6be2e17c7-sys\") pod \"tuned-zzbbp\" (UID: \"d9ada073-20d9-454e-b803-aef6be2e17c7\") " pod="openshift-cluster-node-tuning-operator/tuned-zzbbp"
Apr 23 13:31:51.169297 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.168793 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/2d75111d-f6a4-4ae1-8c0b-e20c6a9b9515-original-pull-secret\") pod \"global-pull-secret-syncer-kjxcs\" (UID: \"2d75111d-f6a4-4ae1-8c0b-e20c6a9b9515\") " pod="kube-system/global-pull-secret-syncer-kjxcs"
Apr 23 13:31:51.169297 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.168820 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/55a0340a-0400-482b-8422-3e0465f0802d-host-run-netns\") pod \"multus-275jz\" (UID: \"55a0340a-0400-482b-8422-3e0465f0802d\") " pod="openshift-multus/multus-275jz"
Apr 23 13:31:51.169297 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.168853 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/8ddfabc2-1040-4841-9473-ed5ba1c0c775-hosts-file\") pod \"node-resolver-cj9dc\" (UID: \"8ddfabc2-1040-4841-9473-ed5ba1c0c775\") " pod="openshift-dns/node-resolver-cj9dc"
Apr 23 13:31:51.169297 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.168877 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b7e52e35-9f8f-43be-b9d9-69181afa13ed-kubelet-dir\") pod \"aws-ebs-csi-driver-node-glcfn\" (UID: \"b7e52e35-9f8f-43be-b9d9-69181afa13ed\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-glcfn"
Apr 23 13:31:51.169297 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.168888 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/edb014da-0558-4a2a-9f98-bea52a2c723e-host-kubelet\") pod \"ovnkube-node-72hmc\" (UID: \"edb014da-0558-4a2a-9f98-bea52a2c723e\") " pod="openshift-ovn-kubernetes/ovnkube-node-72hmc"
Apr 23 13:31:51.169297 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.168922 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qzwvw\" (UniqueName: \"kubernetes.io/projected/edb014da-0558-4a2a-9f98-bea52a2c723e-kube-api-access-qzwvw\") pod \"ovnkube-node-72hmc\" (UID: \"edb014da-0558-4a2a-9f98-bea52a2c723e\") " pod="openshift-ovn-kubernetes/ovnkube-node-72hmc"
Apr 23 13:31:51.169297 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.168952 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/b7e52e35-9f8f-43be-b9d9-69181afa13ed-device-dir\") pod \"aws-ebs-csi-driver-node-glcfn\" (UID: \"b7e52e35-9f8f-43be-b9d9-69181afa13ed\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-glcfn"
Apr 23 13:31:51.169297 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.168978 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/55a0340a-0400-482b-8422-3e0465f0802d-cnibin\") pod \"multus-275jz\" (UID: \"55a0340a-0400-482b-8422-3e0465f0802d\") " pod="openshift-multus/multus-275jz"
Apr 23 13:31:51.169297 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.168984 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/edb014da-0558-4a2a-9f98-bea52a2c723e-ovnkube-script-lib\") pod \"ovnkube-node-72hmc\" (UID: \"edb014da-0558-4a2a-9f98-bea52a2c723e\") " pod="openshift-ovn-kubernetes/ovnkube-node-72hmc"
Apr 23 13:31:51.169297 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.168990 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/edb014da-0558-4a2a-9f98-bea52a2c723e-env-overrides\") pod \"ovnkube-node-72hmc\" (UID: \"edb014da-0558-4a2a-9f98-bea52a2c723e\") " pod="openshift-ovn-kubernetes/ovnkube-node-72hmc"
Apr 23 13:31:51.169297 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.169009 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/edb014da-0558-4a2a-9f98-bea52a2c723e-host-run-netns\") pod \"ovnkube-node-72hmc\" (UID: \"edb014da-0558-4a2a-9f98-bea52a2c723e\") " pod="openshift-ovn-kubernetes/ovnkube-node-72hmc"
Apr 23 13:31:51.169297 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:31:51.169014 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 23 13:31:51.170147 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.169044 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/dc7fc0e6-0ff2-4765-b7da-e0e49fb829cd-cnibin\") pod \"multus-additional-cni-plugins-57l24\" (UID: \"dc7fc0e6-0ff2-4765-b7da-e0e49fb829cd\") " pod="openshift-multus/multus-additional-cni-plugins-57l24"
Apr 23 13:31:51.170147 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.169110 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/edb014da-0558-4a2a-9f98-bea52a2c723e-host-kubelet\") pod \"ovnkube-node-72hmc\" (UID: \"edb014da-0558-4a2a-9f98-bea52a2c723e\") " pod="openshift-ovn-kubernetes/ovnkube-node-72hmc"
Apr 23 13:31:51.170147 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.168817 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/b7e52e35-9f8f-43be-b9d9-69181afa13ed-sys-fs\") pod \"aws-ebs-csi-driver-node-glcfn\" (UID: \"b7e52e35-9f8f-43be-b9d9-69181afa13ed\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-glcfn"
Apr 23 13:31:51.170147 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.169186 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/8ddfabc2-1040-4841-9473-ed5ba1c0c775-hosts-file\") pod \"node-resolver-cj9dc\" (UID: \"8ddfabc2-1040-4841-9473-ed5ba1c0c775\") " pod="openshift-dns/node-resolver-cj9dc"
Apr 23 13:31:51.170147 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.169228 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/dc7fc0e6-0ff2-4765-b7da-e0e49fb829cd-cnibin\") pod \"multus-additional-cni-plugins-57l24\" (UID: \"dc7fc0e6-0ff2-4765-b7da-e0e49fb829cd\") " pod="openshift-multus/multus-additional-cni-plugins-57l24"
Apr 23 13:31:51.170147 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.169250 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/55a0340a-0400-482b-8422-3e0465f0802d-host-run-netns\") pod \"multus-275jz\" (UID: \"55a0340a-0400-482b-8422-3e0465f0802d\") " pod="openshift-multus/multus-275jz"
Apr 23 13:31:51.170147 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.169340 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/edb014da-0558-4a2a-9f98-bea52a2c723e-host-run-netns\") pod \"ovnkube-node-72hmc\" (UID: \"edb014da-0558-4a2a-9f98-bea52a2c723e\") " pod="openshift-ovn-kubernetes/ovnkube-node-72hmc"
Apr 23 13:31:51.170147 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.169438 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d9ada073-20d9-454e-b803-aef6be2e17c7-sys\") pod \"tuned-zzbbp\" (UID: \"d9ada073-20d9-454e-b803-aef6be2e17c7\") " pod="openshift-cluster-node-tuning-operator/tuned-zzbbp"
Apr 23 13:31:51.170147 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:31:51.169483 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2d75111d-f6a4-4ae1-8c0b-e20c6a9b9515-original-pull-secret podName:2d75111d-f6a4-4ae1-8c0b-e20c6a9b9515 nodeName:}" failed. No retries permitted until 2026-04-23 13:31:51.669426995 +0000 UTC m=+3.127886261 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/2d75111d-f6a4-4ae1-8c0b-e20c6a9b9515-original-pull-secret") pod "global-pull-secret-syncer-kjxcs" (UID: "2d75111d-f6a4-4ae1-8c0b-e20c6a9b9515") : object "kube-system"/"original-pull-secret" not registered
Apr 23 13:31:51.170147 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.169088 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/dc7fc0e6-0ff2-4765-b7da-e0e49fb829cd-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-57l24\" (UID: \"dc7fc0e6-0ff2-4765-b7da-e0e49fb829cd\") " pod="openshift-multus/multus-additional-cni-plugins-57l24"
Apr 23 13:31:51.170147 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.169673 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d9ada073-20d9-454e-b803-aef6be2e17c7-var-lib-kubelet\") pod \"tuned-zzbbp\" (UID: \"d9ada073-20d9-454e-b803-aef6be2e17c7\") " pod="openshift-cluster-node-tuning-operator/tuned-zzbbp"
Apr 23 13:31:51.170147 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.169703 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wl5m7\" (UniqueName: \"kubernetes.io/projected/d9ada073-20d9-454e-b803-aef6be2e17c7-kube-api-access-wl5m7\") pod \"tuned-zzbbp\" (UID: \"d9ada073-20d9-454e-b803-aef6be2e17c7\") " pod="openshift-cluster-node-tuning-operator/tuned-zzbbp"
Apr 23 13:31:51.170147 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.169734 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0dcb0e48-f774-49cd-8b04-58a5050a5ff2-metrics-certs\") pod \"network-metrics-daemon-564gb\" (UID: \"0dcb0e48-f774-49cd-8b04-58a5050a5ff2\") " pod="openshift-multus/network-metrics-daemon-564gb"
Apr 23 13:31:51.170147 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.169622 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/edb014da-0558-4a2a-9f98-bea52a2c723e-ovnkube-config\") pod \"ovnkube-node-72hmc\" (UID: \"edb014da-0558-4a2a-9f98-bea52a2c723e\") " pod="openshift-ovn-kubernetes/ovnkube-node-72hmc"
Apr 23 13:31:51.170147 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.169762 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/d9ada073-20d9-454e-b803-aef6be2e17c7-run\") pod \"tuned-zzbbp\" (UID: \"d9ada073-20d9-454e-b803-aef6be2e17c7\") " pod="openshift-cluster-node-tuning-operator/tuned-zzbbp"
Apr 23 13:31:51.170147 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.169624 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/098e4208-e230-428a-af72-f1aa64c09ce0-serviceca\") pod \"node-ca-jfnjq\" (UID: \"098e4208-e230-428a-af72-f1aa64c09ce0\") " pod="openshift-image-registry/node-ca-jfnjq"
Apr 23 13:31:51.170147 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.169836 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5hblj\" (UniqueName: \"kubernetes.io/projected/ab515304-a26a-4e7e-bfeb-cc0ca3a93c8e-kube-api-access-5hblj\") pod \"network-check-target-dggd8\" (UID: \"ab515304-a26a-4e7e-bfeb-cc0ca3a93c8e\") " pod="openshift-network-diagnostics/network-check-target-dggd8"
Apr 23 13:31:51.170929 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.169845 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d9ada073-20d9-454e-b803-aef6be2e17c7-var-lib-kubelet\") pod \"tuned-zzbbp\" (UID: \"d9ada073-20d9-454e-b803-aef6be2e17c7\") " pod="openshift-cluster-node-tuning-operator/tuned-zzbbp"
Apr 23 13:31:51.170929 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.169868 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/b7e52e35-9f8f-43be-b9d9-69181afa13ed-device-dir\") pod \"aws-ebs-csi-driver-node-glcfn\" (UID: \"b7e52e35-9f8f-43be-b9d9-69181afa13ed\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-glcfn"
Apr 23 13:31:51.170929 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:31:51.169994 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 13:31:51.170929 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:31:51.170054 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0dcb0e48-f774-49cd-8b04-58a5050a5ff2-metrics-certs podName:0dcb0e48-f774-49cd-8b04-58a5050a5ff2 nodeName:}" failed. No retries permitted until 2026-04-23 13:31:51.670031637 +0000 UTC m=+3.128490889 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0dcb0e48-f774-49cd-8b04-58a5050a5ff2-metrics-certs") pod "network-metrics-daemon-564gb" (UID: "0dcb0e48-f774-49cd-8b04-58a5050a5ff2") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 13:31:51.170929 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.170108 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/d9ada073-20d9-454e-b803-aef6be2e17c7-run\") pod \"tuned-zzbbp\" (UID: \"d9ada073-20d9-454e-b803-aef6be2e17c7\") " pod="openshift-cluster-node-tuning-operator/tuned-zzbbp"
Apr 23 13:31:51.170929 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.170155 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/55a0340a-0400-482b-8422-3e0465f0802d-cnibin\") pod \"multus-275jz\" (UID: \"55a0340a-0400-482b-8422-3e0465f0802d\") " pod="openshift-multus/multus-275jz"
Apr 23 13:31:51.170929 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.170192 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/edb014da-0558-4a2a-9f98-bea52a2c723e-node-log\") pod \"ovnkube-node-72hmc\" (UID: \"edb014da-0558-4a2a-9f98-bea52a2c723e\") " pod="openshift-ovn-kubernetes/ovnkube-node-72hmc"
Apr 23 13:31:51.170929 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.170237 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/edb014da-0558-4a2a-9f98-bea52a2c723e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-72hmc\" (UID: \"edb014da-0558-4a2a-9f98-bea52a2c723e\") " pod="openshift-ovn-kubernetes/ovnkube-node-72hmc"
Apr 23 13:31:51.170929 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.170269 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d9ada073-20d9-454e-b803-aef6be2e17c7-tmp\") pod \"tuned-zzbbp\" (UID: \"d9ada073-20d9-454e-b803-aef6be2e17c7\") " pod="openshift-cluster-node-tuning-operator/tuned-zzbbp"
Apr 23 13:31:51.170929 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.170299 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/55a0340a-0400-482b-8422-3e0465f0802d-system-cni-dir\") pod \"multus-275jz\" (UID: \"55a0340a-0400-482b-8422-3e0465f0802d\") " pod="openshift-multus/multus-275jz"
Apr 23 13:31:51.170929 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.170328 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/55a0340a-0400-482b-8422-3e0465f0802d-cni-binary-copy\") pod \"multus-275jz\" (UID: \"55a0340a-0400-482b-8422-3e0465f0802d\") " pod="openshift-multus/multus-275jz"
Apr 23 13:31:51.170929 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.170352 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/edb014da-0558-4a2a-9f98-bea52a2c723e-systemd-units\") pod \"ovnkube-node-72hmc\" (UID: \"edb014da-0558-4a2a-9f98-bea52a2c723e\") " pod="openshift-ovn-kubernetes/ovnkube-node-72hmc"
Apr 23 13:31:51.170929 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.170384 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/324be42f-87e9-413c-a39b-1c5ebac3ad6d-host-slash\") pod \"iptables-alerter-66br2\" (UID: \"324be42f-87e9-413c-a39b-1c5ebac3ad6d\") " pod="openshift-network-operator/iptables-alerter-66br2"
Apr 23 13:31:51.170929 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.170397 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/dc7fc0e6-0ff2-4765-b7da-e0e49fb829cd-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-57l24\" (UID: \"dc7fc0e6-0ff2-4765-b7da-e0e49fb829cd\") " pod="openshift-multus/multus-additional-cni-plugins-57l24"
Apr 23 13:31:51.170929 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.170433 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/2d75111d-f6a4-4ae1-8c0b-e20c6a9b9515-kubelet-config\") pod \"global-pull-secret-syncer-kjxcs\" (UID: \"2d75111d-f6a4-4ae1-8c0b-e20c6a9b9515\") " pod="kube-system/global-pull-secret-syncer-kjxcs"
Apr 23 13:31:51.170929 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.170468 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/55a0340a-0400-482b-8422-3e0465f0802d-os-release\") pod \"multus-275jz\" (UID: \"55a0340a-0400-482b-8422-3e0465f0802d\") " pod="openshift-multus/multus-275jz"
Apr 23 13:31:51.170929 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.170490 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/edb014da-0558-4a2a-9f98-bea52a2c723e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-72hmc\" (UID: \"edb014da-0558-4a2a-9f98-bea52a2c723e\") " pod="openshift-ovn-kubernetes/ovnkube-node-72hmc"
Apr 23 13:31:51.171666 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.170498 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/55a0340a-0400-482b-8422-3e0465f0802d-hostroot\") pod \"multus-275jz\" (UID: \"55a0340a-0400-482b-8422-3e0465f0802d\") " pod="openshift-multus/multus-275jz"
Apr 23 13:31:51.171666 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.170529 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/edb014da-0558-4a2a-9f98-bea52a2c723e-host-cni-bin\") pod \"ovnkube-node-72hmc\" (UID: \"edb014da-0558-4a2a-9f98-bea52a2c723e\") " pod="openshift-ovn-kubernetes/ovnkube-node-72hmc"
Apr 23 13:31:51.171666 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.170559 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/2d75111d-f6a4-4ae1-8c0b-e20c6a9b9515-kubelet-config\") pod \"global-pull-secret-syncer-kjxcs\" (UID: \"2d75111d-f6a4-4ae1-8c0b-e20c6a9b9515\") " pod="kube-system/global-pull-secret-syncer-kjxcs"
Apr 23 13:31:51.171666 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.170562 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/349fe628-9a5e-4f45-bd89-4d75157af516-konnectivity-ca\") pod \"konnectivity-agent-62dc8\" (UID: \"349fe628-9a5e-4f45-bd89-4d75157af516\") " pod="kube-system/konnectivity-agent-62dc8"
Apr 23 13:31:51.171666 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.170637 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/324be42f-87e9-413c-a39b-1c5ebac3ad6d-host-slash\") pod \"iptables-alerter-66br2\" (UID: \"324be42f-87e9-413c-a39b-1c5ebac3ad6d\") " pod="openshift-network-operator/iptables-alerter-66br2"
Apr 23 13:31:51.171666 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.170674 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zb745\" (UniqueName: \"kubernetes.io/projected/324be42f-87e9-413c-a39b-1c5ebac3ad6d-kube-api-access-zb745\") pod \"iptables-alerter-66br2\" (UID: \"324be42f-87e9-413c-a39b-1c5ebac3ad6d\") " pod="openshift-network-operator/iptables-alerter-66br2"
Apr 23 13:31:51.171666 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.170710 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/dc7fc0e6-0ff2-4765-b7da-e0e49fb829cd-system-cni-dir\") pod \"multus-additional-cni-plugins-57l24\" (UID: \"dc7fc0e6-0ff2-4765-b7da-e0e49fb829cd\") " pod="openshift-multus/multus-additional-cni-plugins-57l24"
Apr 23 13:31:51.171666 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.170747 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/dc7fc0e6-0ff2-4765-b7da-e0e49fb829cd-cni-binary-copy\") pod \"multus-additional-cni-plugins-57l24\" (UID: \"dc7fc0e6-0ff2-4765-b7da-e0e49fb829cd\") " pod="openshift-multus/multus-additional-cni-plugins-57l24"
Apr 23 13:31:51.171666 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.170755 2576 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 23 13:31:51.171666 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.170776 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/55a0340a-0400-482b-8422-3e0465f0802d-host-var-lib-kubelet\") pod \"multus-275jz\" (UID: \"55a0340a-0400-482b-8422-3e0465f0802d\") " pod="openshift-multus/multus-275jz"
Apr 23 13:31:51.171666 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.170806 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/edb014da-0558-4a2a-9f98-bea52a2c723e-run-ovn\") pod \"ovnkube-node-72hmc\" (UID: \"edb014da-0558-4a2a-9f98-bea52a2c723e\") " pod="openshift-ovn-kubernetes/ovnkube-node-72hmc"
Apr 23 13:31:51.171666 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.170836 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/edb014da-0558-4a2a-9f98-bea52a2c723e-host-run-ovn-kubernetes\") pod \"ovnkube-node-72hmc\" (UID: \"edb014da-0558-4a2a-9f98-bea52a2c723e\") " pod="openshift-ovn-kubernetes/ovnkube-node-72hmc"
Apr 23 13:31:51.171666 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.170864 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8ddfabc2-1040-4841-9473-ed5ba1c0c775-tmp-dir\") pod \"node-resolver-cj9dc\" (UID: \"8ddfabc2-1040-4841-9473-ed5ba1c0c775\") " pod="openshift-dns/node-resolver-cj9dc"
Apr 23 13:31:51.171666 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.170890 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/b7e52e35-9f8f-43be-b9d9-69181afa13ed-etc-selinux\") pod \"aws-ebs-csi-driver-node-glcfn\" (UID: \"b7e52e35-9f8f-43be-b9d9-69181afa13ed\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-glcfn"
Apr 23 13:31:51.171666 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.170914 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/d9ada073-20d9-454e-b803-aef6be2e17c7-etc-systemd\") pod \"tuned-zzbbp\" (UID: \"d9ada073-20d9-454e-b803-aef6be2e17c7\") " pod="openshift-cluster-node-tuning-operator/tuned-zzbbp"
Apr 23 13:31:51.171666 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.170949 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/dc7fc0e6-0ff2-4765-b7da-e0e49fb829cd-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-57l24\" (UID: \"dc7fc0e6-0ff2-4765-b7da-e0e49fb829cd\") " pod="openshift-multus/multus-additional-cni-plugins-57l24"
Apr 23 13:31:51.171666 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.170985 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/55a0340a-0400-482b-8422-3e0465f0802d-multus-daemon-config\") pod \"multus-275jz\" (UID: \"55a0340a-0400-482b-8422-3e0465f0802d\") " pod="openshift-multus/multus-275jz"
Apr 23 13:31:51.172377 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.171014 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/edb014da-0558-4a2a-9f98-bea52a2c723e-log-socket\") pod \"ovnkube-node-72hmc\" (UID: \"edb014da-0558-4a2a-9f98-bea52a2c723e\") " pod="openshift-ovn-kubernetes/ovnkube-node-72hmc"
Apr 23 13:31:51.172377 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.171021 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/349fe628-9a5e-4f45-bd89-4d75157af516-konnectivity-ca\") pod \"konnectivity-agent-62dc8\" (UID: \"349fe628-9a5e-4f45-bd89-4d75157af516\") " pod="kube-system/konnectivity-agent-62dc8"
Apr 23 13:31:51.172377 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.171045 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/349fe628-9a5e-4f45-bd89-4d75157af516-agent-certs\") pod \"konnectivity-agent-62dc8\" (UID: \"349fe628-9a5e-4f45-bd89-4d75157af516\") " pod="kube-system/konnectivity-agent-62dc8"
Apr 23 13:31:51.172377 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.171101 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/55a0340a-0400-482b-8422-3e0465f0802d-system-cni-dir\") pod \"multus-275jz\" (UID: \"55a0340a-0400-482b-8422-3e0465f0802d\") " pod="openshift-multus/multus-275jz"
Apr 23 13:31:51.172377 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.171140 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5dxxk\" (UniqueName: \"kubernetes.io/projected/b7e52e35-9f8f-43be-b9d9-69181afa13ed-kube-api-access-5dxxk\") pod \"aws-ebs-csi-driver-node-glcfn\" (UID: \"b7e52e35-9f8f-43be-b9d9-69181afa13ed\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-glcfn"
Apr 23 13:31:51.172377 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.171174 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/b7e52e35-9f8f-43be-b9d9-69181afa13ed-registration-dir\") pod \"aws-ebs-csi-driver-node-glcfn\" (UID: \"b7e52e35-9f8f-43be-b9d9-69181afa13ed\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-glcfn"
Apr 23 13:31:51.172377 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.171200 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/324be42f-87e9-413c-a39b-1c5ebac3ad6d-iptables-alerter-script\") pod \"iptables-alerter-66br2\" (UID: \"324be42f-87e9-413c-a39b-1c5ebac3ad6d\") " pod="openshift-network-operator/iptables-alerter-66br2"
Apr 23 13:31:51.172377 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.171232 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/dc7fc0e6-0ff2-4765-b7da-e0e49fb829cd-os-release\") pod \"multus-additional-cni-plugins-57l24\" (UID: \"dc7fc0e6-0ff2-4765-b7da-e0e49fb829cd\") " pod="openshift-multus/multus-additional-cni-plugins-57l24"
Apr 23 13:31:51.172377 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.171262 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/edb014da-0558-4a2a-9f98-bea52a2c723e-ovn-node-metrics-cert\") pod \"ovnkube-node-72hmc\" (UID: \"edb014da-0558-4a2a-9f98-bea52a2c723e\") " pod="openshift-ovn-kubernetes/ovnkube-node-72hmc"
Apr 23 13:31:51.172377 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.171292 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k9gsd\" (UniqueName: \"kubernetes.io/projected/098e4208-e230-428a-af72-f1aa64c09ce0-kube-api-access-k9gsd\") pod \"node-ca-jfnjq\" (UID: \"098e4208-e230-428a-af72-f1aa64c09ce0\") " pod="openshift-image-registry/node-ca-jfnjq"
Apr 23 13:31:51.172377 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.171325 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-txm7v\" (UniqueName: \"kubernetes.io/projected/dc7fc0e6-0ff2-4765-b7da-e0e49fb829cd-kube-api-access-txm7v\") pod \"multus-additional-cni-plugins-57l24\" (UID: \"dc7fc0e6-0ff2-4765-b7da-e0e49fb829cd\") " pod="openshift-multus/multus-additional-cni-plugins-57l24"
Apr 23 13:31:51.172377 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.171376 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c7vlz\" (UniqueName: \"kubernetes.io/projected/55a0340a-0400-482b-8422-3e0465f0802d-kube-api-access-c7vlz\") pod \"multus-275jz\" (UID: \"55a0340a-0400-482b-8422-3e0465f0802d\") " pod="openshift-multus/multus-275jz"
Apr 23 13:31:51.172377 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.171425 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/d9ada073-20d9-454e-b803-aef6be2e17c7-etc-modprobe-d\") pod \"tuned-zzbbp\" (UID: \"d9ada073-20d9-454e-b803-aef6be2e17c7\") " pod="openshift-cluster-node-tuning-operator/tuned-zzbbp"
Apr 23 13:31:51.172377 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.171458 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/d9ada073-20d9-454e-b803-aef6be2e17c7-etc-sysconfig\") pod \"tuned-zzbbp\" (UID: \"d9ada073-20d9-454e-b803-aef6be2e17c7\") " pod="openshift-cluster-node-tuning-operator/tuned-zzbbp"
Apr 23 13:31:51.172377 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.171484 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/d9ada073-20d9-454e-b803-aef6be2e17c7-etc-tuned\") pod \"tuned-zzbbp\" (UID: \"d9ada073-20d9-454e-b803-aef6be2e17c7\") " pod="openshift-cluster-node-tuning-operator/tuned-zzbbp"
Apr 23 13:31:51.172377 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.171511 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/2d75111d-f6a4-4ae1-8c0b-e20c6a9b9515-dbus\") pod \"global-pull-secret-syncer-kjxcs\" (UID: \"2d75111d-f6a4-4ae1-8c0b-e20c6a9b9515\") " pod="kube-system/global-pull-secret-syncer-kjxcs"
Apr 23 13:31:51.172377 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.171541 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/edb014da-0558-4a2a-9f98-bea52a2c723e-host-cni-netd\") pod \"ovnkube-node-72hmc\" (UID: \"edb014da-0558-4a2a-9f98-bea52a2c723e\") " pod="openshift-ovn-kubernetes/ovnkube-node-72hmc"
Apr 23 13:31:51.173155 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.171571 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/55a0340a-0400-482b-8422-3e0465f0802d-host-var-lib-cni-bin\") pod \"multus-275jz\" (UID: \"55a0340a-0400-482b-8422-3e0465f0802d\") " pod="openshift-multus/multus-275jz"
Apr 23 13:31:51.173155 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.171601 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/edb014da-0558-4a2a-9f98-bea52a2c723e-run-openvswitch\") pod \"ovnkube-node-72hmc\" (UID: \"edb014da-0558-4a2a-9f98-bea52a2c723e\") " pod="openshift-ovn-kubernetes/ovnkube-node-72hmc"
Apr 23 13:31:51.173155 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.171627 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/b7e52e35-9f8f-43be-b9d9-69181afa13ed-socket-dir\") pod \"aws-ebs-csi-driver-node-glcfn\" (UID: \"b7e52e35-9f8f-43be-b9d9-69181afa13ed\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-glcfn"
Apr 23 13:31:51.173155 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.171658 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/098e4208-e230-428a-af72-f1aa64c09ce0-host\") pod \"node-ca-jfnjq\" (UID: \"098e4208-e230-428a-af72-f1aa64c09ce0\") " pod="openshift-image-registry/node-ca-jfnjq"
Apr 23 13:31:51.173155 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.171690 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/dc7fc0e6-0ff2-4765-b7da-e0e49fb829cd-tuning-conf-dir\") pod \"multus-additional-cni-plugins-57l24\" (UID: \"dc7fc0e6-0ff2-4765-b7da-e0e49fb829cd\") " pod="openshift-multus/multus-additional-cni-plugins-57l24"
Apr 23 13:31:51.173155 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.171722 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/55a0340a-0400-482b-8422-3e0465f0802d-host-var-lib-cni-multus\") pod \"multus-275jz\" (UID: \"55a0340a-0400-482b-8422-3e0465f0802d\") " pod="openshift-multus/multus-275jz"
Apr 23 13:31:51.173155 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.171753 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/55a0340a-0400-482b-8422-3e0465f0802d-multus-conf-dir\") pod \"multus-275jz\" (UID: \"55a0340a-0400-482b-8422-3e0465f0802d\") " pod="openshift-multus/multus-275jz"
Apr 23 13:31:51.173155 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.171778 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/d9ada073-20d9-454e-b803-aef6be2e17c7-etc-sysctl-d\") pod \"tuned-zzbbp\" (UID: \"d9ada073-20d9-454e-b803-aef6be2e17c7\") " pod="openshift-cluster-node-tuning-operator/tuned-zzbbp"
Apr 23 13:31:51.173155 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.171807 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/d9ada073-20d9-454e-b803-aef6be2e17c7-etc-sysctl-conf\") pod \"tuned-zzbbp\" (UID: \"d9ada073-20d9-454e-b803-aef6be2e17c7\") " pod="openshift-cluster-node-tuning-operator/tuned-zzbbp"
Apr 23 13:31:51.173155 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.171834 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d9ada073-20d9-454e-b803-aef6be2e17c7-lib-modules\") pod \"tuned-zzbbp\" (UID: \"d9ada073-20d9-454e-b803-aef6be2e17c7\") " pod="openshift-cluster-node-tuning-operator/tuned-zzbbp"
Apr 23 13:31:51.173155 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.171864 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/55a0340a-0400-482b-8422-3e0465f0802d-host-run-multus-certs\") pod \"multus-275jz\" (UID: \"55a0340a-0400-482b-8422-3e0465f0802d\") " pod="openshift-multus/multus-275jz"
Apr 23 13:31:51.173155 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.171891 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/55a0340a-0400-482b-8422-3e0465f0802d-etc-kubernetes\") pod \"multus-275jz\" (UID: \"55a0340a-0400-482b-8422-3e0465f0802d\") " pod="openshift-multus/multus-275jz"
Apr 23 13:31:51.173155 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.172044 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/edb014da-0558-4a2a-9f98-bea52a2c723e-host-run-ovn-kubernetes\") pod \"ovnkube-node-72hmc\" (UID: \"edb014da-0558-4a2a-9f98-bea52a2c723e\") " pod="openshift-ovn-kubernetes/ovnkube-node-72hmc"
Apr 23 13:31:51.173155 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.172138 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/55a0340a-0400-482b-8422-3e0465f0802d-etc-kubernetes\") pod \"multus-275jz\" (UID: \"55a0340a-0400-482b-8422-3e0465f0802d\") " pod="openshift-multus/multus-275jz"
Apr 23 13:31:51.173155 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.172266 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/edb014da-0558-4a2a-9f98-bea52a2c723e-run-ovn\") pod \"ovnkube-node-72hmc\" (UID: \"edb014da-0558-4a2a-9f98-bea52a2c723e\") " pod="openshift-ovn-kubernetes/ovnkube-node-72hmc"
Apr 23 13:31:51.173155 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.172491 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/d9ada073-20d9-454e-b803-aef6be2e17c7-etc-systemd\") pod \"tuned-zzbbp\" (UID: \"d9ada073-20d9-454e-b803-aef6be2e17c7\") " pod="openshift-cluster-node-tuning-operator/tuned-zzbbp"
Apr 23 13:31:51.173155 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.172803 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8ddfabc2-1040-4841-9473-ed5ba1c0c775-tmp-dir\") pod \"node-resolver-cj9dc\" (UID: \"8ddfabc2-1040-4841-9473-ed5ba1c0c775\") " pod="openshift-dns/node-resolver-cj9dc"
Apr 23 13:31:51.173155 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.172821 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/55a0340a-0400-482b-8422-3e0465f0802d-multus-daemon-config\") pod \"multus-275jz\" (UID: \"55a0340a-0400-482b-8422-3e0465f0802d\") " pod="openshift-multus/multus-275jz"
Apr 23 13:31:51.173948 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.172834 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/dc7fc0e6-0ff2-4765-b7da-e0e49fb829cd-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-57l24\" (UID: \"dc7fc0e6-0ff2-4765-b7da-e0e49fb829cd\") " pod="openshift-multus/multus-additional-cni-plugins-57l24"
Apr 23 13:31:51.173948 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.172953 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/b7e52e35-9f8f-43be-b9d9-69181afa13ed-etc-selinux\") pod \"aws-ebs-csi-driver-node-glcfn\" (UID: \"b7e52e35-9f8f-43be-b9d9-69181afa13ed\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-glcfn"
Apr 23 13:31:51.173948 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.172926 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/d9ada073-20d9-454e-b803-aef6be2e17c7-etc-sysconfig\") pod \"tuned-zzbbp\" (UID: \"d9ada073-20d9-454e-b803-aef6be2e17c7\") " pod="openshift-cluster-node-tuning-operator/tuned-zzbbp"
Apr 23 13:31:51.173948 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.172997 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/55a0340a-0400-482b-8422-3e0465f0802d-os-release\") pod \"multus-275jz\" (UID:
\"55a0340a-0400-482b-8422-3e0465f0802d\") " pod="openshift-multus/multus-275jz" Apr 23 13:31:51.173948 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.173067 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/edb014da-0558-4a2a-9f98-bea52a2c723e-host-cni-bin\") pod \"ovnkube-node-72hmc\" (UID: \"edb014da-0558-4a2a-9f98-bea52a2c723e\") " pod="openshift-ovn-kubernetes/ovnkube-node-72hmc" Apr 23 13:31:51.173948 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.173072 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/55a0340a-0400-482b-8422-3e0465f0802d-host-var-lib-kubelet\") pod \"multus-275jz\" (UID: \"55a0340a-0400-482b-8422-3e0465f0802d\") " pod="openshift-multus/multus-275jz" Apr 23 13:31:51.173948 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.173274 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/dc7fc0e6-0ff2-4765-b7da-e0e49fb829cd-system-cni-dir\") pod \"multus-additional-cni-plugins-57l24\" (UID: \"dc7fc0e6-0ff2-4765-b7da-e0e49fb829cd\") " pod="openshift-multus/multus-additional-cni-plugins-57l24" Apr 23 13:31:51.173948 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.173270 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/55a0340a-0400-482b-8422-3e0465f0802d-cni-binary-copy\") pod \"multus-275jz\" (UID: \"55a0340a-0400-482b-8422-3e0465f0802d\") " pod="openshift-multus/multus-275jz" Apr 23 13:31:51.173948 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.173330 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/55a0340a-0400-482b-8422-3e0465f0802d-hostroot\") pod \"multus-275jz\" (UID: \"55a0340a-0400-482b-8422-3e0465f0802d\") " 
pod="openshift-multus/multus-275jz" Apr 23 13:31:51.173948 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.173273 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/dc7fc0e6-0ff2-4765-b7da-e0e49fb829cd-cni-binary-copy\") pod \"multus-additional-cni-plugins-57l24\" (UID: \"dc7fc0e6-0ff2-4765-b7da-e0e49fb829cd\") " pod="openshift-multus/multus-additional-cni-plugins-57l24" Apr 23 13:31:51.173948 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.173483 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/b7e52e35-9f8f-43be-b9d9-69181afa13ed-registration-dir\") pod \"aws-ebs-csi-driver-node-glcfn\" (UID: \"b7e52e35-9f8f-43be-b9d9-69181afa13ed\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-glcfn" Apr 23 13:31:51.173948 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.173494 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/dc7fc0e6-0ff2-4765-b7da-e0e49fb829cd-os-release\") pod \"multus-additional-cni-plugins-57l24\" (UID: \"dc7fc0e6-0ff2-4765-b7da-e0e49fb829cd\") " pod="openshift-multus/multus-additional-cni-plugins-57l24" Apr 23 13:31:51.173948 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.173622 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/d9ada073-20d9-454e-b803-aef6be2e17c7-etc-sysctl-d\") pod \"tuned-zzbbp\" (UID: \"d9ada073-20d9-454e-b803-aef6be2e17c7\") " pod="openshift-cluster-node-tuning-operator/tuned-zzbbp" Apr 23 13:31:51.173948 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.173738 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/dc7fc0e6-0ff2-4765-b7da-e0e49fb829cd-tuning-conf-dir\") pod \"multus-additional-cni-plugins-57l24\" 
(UID: \"dc7fc0e6-0ff2-4765-b7da-e0e49fb829cd\") " pod="openshift-multus/multus-additional-cni-plugins-57l24" Apr 23 13:31:51.173948 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.173809 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/55a0340a-0400-482b-8422-3e0465f0802d-host-var-lib-cni-multus\") pod \"multus-275jz\" (UID: \"55a0340a-0400-482b-8422-3e0465f0802d\") " pod="openshift-multus/multus-275jz" Apr 23 13:31:51.173948 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.173854 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/edb014da-0558-4a2a-9f98-bea52a2c723e-log-socket\") pod \"ovnkube-node-72hmc\" (UID: \"edb014da-0558-4a2a-9f98-bea52a2c723e\") " pod="openshift-ovn-kubernetes/ovnkube-node-72hmc" Apr 23 13:31:51.173948 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.173866 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/55a0340a-0400-482b-8422-3e0465f0802d-multus-conf-dir\") pod \"multus-275jz\" (UID: \"55a0340a-0400-482b-8422-3e0465f0802d\") " pod="openshift-multus/multus-275jz" Apr 23 13:31:51.174716 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.173946 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/d9ada073-20d9-454e-b803-aef6be2e17c7-etc-sysctl-conf\") pod \"tuned-zzbbp\" (UID: \"d9ada073-20d9-454e-b803-aef6be2e17c7\") " pod="openshift-cluster-node-tuning-operator/tuned-zzbbp" Apr 23 13:31:51.174716 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.173958 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d9ada073-20d9-454e-b803-aef6be2e17c7-lib-modules\") pod \"tuned-zzbbp\" (UID: \"d9ada073-20d9-454e-b803-aef6be2e17c7\") " 
pod="openshift-cluster-node-tuning-operator/tuned-zzbbp" Apr 23 13:31:51.174716 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.173995 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/324be42f-87e9-413c-a39b-1c5ebac3ad6d-iptables-alerter-script\") pod \"iptables-alerter-66br2\" (UID: \"324be42f-87e9-413c-a39b-1c5ebac3ad6d\") " pod="openshift-network-operator/iptables-alerter-66br2" Apr 23 13:31:51.174716 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.173995 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/55a0340a-0400-482b-8422-3e0465f0802d-host-var-lib-cni-bin\") pod \"multus-275jz\" (UID: \"55a0340a-0400-482b-8422-3e0465f0802d\") " pod="openshift-multus/multus-275jz" Apr 23 13:31:51.174716 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.174036 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/098e4208-e230-428a-af72-f1aa64c09ce0-host\") pod \"node-ca-jfnjq\" (UID: \"098e4208-e230-428a-af72-f1aa64c09ce0\") " pod="openshift-image-registry/node-ca-jfnjq" Apr 23 13:31:51.174716 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.174141 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/edb014da-0558-4a2a-9f98-bea52a2c723e-node-log\") pod \"ovnkube-node-72hmc\" (UID: \"edb014da-0558-4a2a-9f98-bea52a2c723e\") " pod="openshift-ovn-kubernetes/ovnkube-node-72hmc" Apr 23 13:31:51.174716 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.174185 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/55a0340a-0400-482b-8422-3e0465f0802d-host-run-multus-certs\") pod \"multus-275jz\" (UID: \"55a0340a-0400-482b-8422-3e0465f0802d\") " pod="openshift-multus/multus-275jz" 
Apr 23 13:31:51.174716 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.174317 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/b7e52e35-9f8f-43be-b9d9-69181afa13ed-socket-dir\") pod \"aws-ebs-csi-driver-node-glcfn\" (UID: \"b7e52e35-9f8f-43be-b9d9-69181afa13ed\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-glcfn" Apr 23 13:31:51.174716 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.174456 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/edb014da-0558-4a2a-9f98-bea52a2c723e-systemd-units\") pod \"ovnkube-node-72hmc\" (UID: \"edb014da-0558-4a2a-9f98-bea52a2c723e\") " pod="openshift-ovn-kubernetes/ovnkube-node-72hmc" Apr 23 13:31:51.174716 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.174460 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/d9ada073-20d9-454e-b803-aef6be2e17c7-etc-modprobe-d\") pod \"tuned-zzbbp\" (UID: \"d9ada073-20d9-454e-b803-aef6be2e17c7\") " pod="openshift-cluster-node-tuning-operator/tuned-zzbbp" Apr 23 13:31:51.174716 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.174529 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/edb014da-0558-4a2a-9f98-bea52a2c723e-run-openvswitch\") pod \"ovnkube-node-72hmc\" (UID: \"edb014da-0558-4a2a-9f98-bea52a2c723e\") " pod="openshift-ovn-kubernetes/ovnkube-node-72hmc" Apr 23 13:31:51.174716 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.174581 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/edb014da-0558-4a2a-9f98-bea52a2c723e-host-cni-netd\") pod \"ovnkube-node-72hmc\" (UID: \"edb014da-0558-4a2a-9f98-bea52a2c723e\") " pod="openshift-ovn-kubernetes/ovnkube-node-72hmc" Apr 23 
13:31:51.174716 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.174718 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/2d75111d-f6a4-4ae1-8c0b-e20c6a9b9515-dbus\") pod \"global-pull-secret-syncer-kjxcs\" (UID: \"2d75111d-f6a4-4ae1-8c0b-e20c6a9b9515\") " pod="kube-system/global-pull-secret-syncer-kjxcs" Apr 23 13:31:51.176876 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.176666 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/edb014da-0558-4a2a-9f98-bea52a2c723e-ovn-node-metrics-cert\") pod \"ovnkube-node-72hmc\" (UID: \"edb014da-0558-4a2a-9f98-bea52a2c723e\") " pod="openshift-ovn-kubernetes/ovnkube-node-72hmc" Apr 23 13:31:51.176876 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.176675 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d9ada073-20d9-454e-b803-aef6be2e17c7-tmp\") pod \"tuned-zzbbp\" (UID: \"d9ada073-20d9-454e-b803-aef6be2e17c7\") " pod="openshift-cluster-node-tuning-operator/tuned-zzbbp" Apr 23 13:31:51.176876 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.176717 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/349fe628-9a5e-4f45-bd89-4d75157af516-agent-certs\") pod \"konnectivity-agent-62dc8\" (UID: \"349fe628-9a5e-4f45-bd89-4d75157af516\") " pod="kube-system/konnectivity-agent-62dc8" Apr 23 13:31:51.177532 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.177377 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/d9ada073-20d9-454e-b803-aef6be2e17c7-etc-tuned\") pod \"tuned-zzbbp\" (UID: \"d9ada073-20d9-454e-b803-aef6be2e17c7\") " pod="openshift-cluster-node-tuning-operator/tuned-zzbbp" Apr 23 13:31:51.179097 ip-10-0-134-22 kubenswrapper[2576]: I0423 
13:31:51.179035 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9kns\" (UniqueName: \"kubernetes.io/projected/8ddfabc2-1040-4841-9473-ed5ba1c0c775-kube-api-access-w9kns\") pod \"node-resolver-cj9dc\" (UID: \"8ddfabc2-1040-4841-9473-ed5ba1c0c775\") " pod="openshift-dns/node-resolver-cj9dc" Apr 23 13:31:51.179340 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:31:51.179156 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 13:31:51.179340 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:31:51.179178 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 13:31:51.179340 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:31:51.179193 2576 projected.go:194] Error preparing data for projected volume kube-api-access-5hblj for pod openshift-network-diagnostics/network-check-target-dggd8: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 13:31:51.179340 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.179235 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wl5m7\" (UniqueName: \"kubernetes.io/projected/d9ada073-20d9-454e-b803-aef6be2e17c7-kube-api-access-wl5m7\") pod \"tuned-zzbbp\" (UID: \"d9ada073-20d9-454e-b803-aef6be2e17c7\") " pod="openshift-cluster-node-tuning-operator/tuned-zzbbp" Apr 23 13:31:51.179340 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:31:51.179265 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ab515304-a26a-4e7e-bfeb-cc0ca3a93c8e-kube-api-access-5hblj podName:ab515304-a26a-4e7e-bfeb-cc0ca3a93c8e nodeName:}" failed. 
No retries permitted until 2026-04-23 13:31:51.679247827 +0000 UTC m=+3.137707096 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-5hblj" (UniqueName: "kubernetes.io/projected/ab515304-a26a-4e7e-bfeb-cc0ca3a93c8e-kube-api-access-5hblj") pod "network-check-target-dggd8" (UID: "ab515304-a26a-4e7e-bfeb-cc0ca3a93c8e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 13:31:51.180168 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.180148 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qg9z9\" (UniqueName: \"kubernetes.io/projected/0dcb0e48-f774-49cd-8b04-58a5050a5ff2-kube-api-access-qg9z9\") pod \"network-metrics-daemon-564gb\" (UID: \"0dcb0e48-f774-49cd-8b04-58a5050a5ff2\") " pod="openshift-multus/network-metrics-daemon-564gb" Apr 23 13:31:51.182236 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.182211 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5dxxk\" (UniqueName: \"kubernetes.io/projected/b7e52e35-9f8f-43be-b9d9-69181afa13ed-kube-api-access-5dxxk\") pod \"aws-ebs-csi-driver-node-glcfn\" (UID: \"b7e52e35-9f8f-43be-b9d9-69181afa13ed\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-glcfn" Apr 23 13:31:51.184188 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.184166 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qzwvw\" (UniqueName: \"kubernetes.io/projected/edb014da-0558-4a2a-9f98-bea52a2c723e-kube-api-access-qzwvw\") pod \"ovnkube-node-72hmc\" (UID: \"edb014da-0558-4a2a-9f98-bea52a2c723e\") " pod="openshift-ovn-kubernetes/ovnkube-node-72hmc" Apr 23 13:31:51.184296 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.184274 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c7vlz\" (UniqueName: 
\"kubernetes.io/projected/55a0340a-0400-482b-8422-3e0465f0802d-kube-api-access-c7vlz\") pod \"multus-275jz\" (UID: \"55a0340a-0400-482b-8422-3e0465f0802d\") " pod="openshift-multus/multus-275jz" Apr 23 13:31:51.184798 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.184774 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zb745\" (UniqueName: \"kubernetes.io/projected/324be42f-87e9-413c-a39b-1c5ebac3ad6d-kube-api-access-zb745\") pod \"iptables-alerter-66br2\" (UID: \"324be42f-87e9-413c-a39b-1c5ebac3ad6d\") " pod="openshift-network-operator/iptables-alerter-66br2" Apr 23 13:31:51.185261 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.185212 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9gsd\" (UniqueName: \"kubernetes.io/projected/098e4208-e230-428a-af72-f1aa64c09ce0-kube-api-access-k9gsd\") pod \"node-ca-jfnjq\" (UID: \"098e4208-e230-428a-af72-f1aa64c09ce0\") " pod="openshift-image-registry/node-ca-jfnjq" Apr 23 13:31:51.185361 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.185341 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-txm7v\" (UniqueName: \"kubernetes.io/projected/dc7fc0e6-0ff2-4765-b7da-e0e49fb829cd-kube-api-access-txm7v\") pod \"multus-additional-cni-plugins-57l24\" (UID: \"dc7fc0e6-0ff2-4765-b7da-e0e49fb829cd\") " pod="openshift-multus/multus-additional-cni-plugins-57l24" Apr 23 13:31:51.325909 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.325818 2576 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 23 13:31:51.353965 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.353935 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-zzbbp" Apr 23 13:31:51.361945 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.361924 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-cj9dc" Apr 23 13:31:51.371586 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.371563 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-62dc8" Apr 23 13:31:51.374282 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.374261 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-glcfn" Apr 23 13:31:51.381778 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.381758 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-57l24" Apr 23 13:31:51.387449 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.387427 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-275jz" Apr 23 13:31:51.393097 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.393079 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-72hmc" Apr 23 13:31:51.400683 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.400665 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-66br2" Apr 23 13:31:51.406277 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.406254 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-jfnjq" Apr 23 13:31:51.675690 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.675605 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/2d75111d-f6a4-4ae1-8c0b-e20c6a9b9515-original-pull-secret\") pod \"global-pull-secret-syncer-kjxcs\" (UID: \"2d75111d-f6a4-4ae1-8c0b-e20c6a9b9515\") " pod="kube-system/global-pull-secret-syncer-kjxcs" Apr 23 13:31:51.675690 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.675674 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0dcb0e48-f774-49cd-8b04-58a5050a5ff2-metrics-certs\") pod \"network-metrics-daemon-564gb\" (UID: \"0dcb0e48-f774-49cd-8b04-58a5050a5ff2\") " pod="openshift-multus/network-metrics-daemon-564gb" Apr 23 13:31:51.675921 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:31:51.675779 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 23 13:31:51.675921 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:31:51.675820 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 13:31:51.675921 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:31:51.675857 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2d75111d-f6a4-4ae1-8c0b-e20c6a9b9515-original-pull-secret podName:2d75111d-f6a4-4ae1-8c0b-e20c6a9b9515 nodeName:}" failed. No retries permitted until 2026-04-23 13:31:52.675839173 +0000 UTC m=+4.134298446 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/2d75111d-f6a4-4ae1-8c0b-e20c6a9b9515-original-pull-secret") pod "global-pull-secret-syncer-kjxcs" (UID: "2d75111d-f6a4-4ae1-8c0b-e20c6a9b9515") : object "kube-system"/"original-pull-secret" not registered Apr 23 13:31:51.675921 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:31:51.675879 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0dcb0e48-f774-49cd-8b04-58a5050a5ff2-metrics-certs podName:0dcb0e48-f774-49cd-8b04-58a5050a5ff2 nodeName:}" failed. No retries permitted until 2026-04-23 13:31:52.675868568 +0000 UTC m=+4.134327825 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0dcb0e48-f774-49cd-8b04-58a5050a5ff2-metrics-certs") pod "network-metrics-daemon-564gb" (UID: "0dcb0e48-f774-49cd-8b04-58a5050a5ff2") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 13:31:51.776497 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:51.776472 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5hblj\" (UniqueName: \"kubernetes.io/projected/ab515304-a26a-4e7e-bfeb-cc0ca3a93c8e-kube-api-access-5hblj\") pod \"network-check-target-dggd8\" (UID: \"ab515304-a26a-4e7e-bfeb-cc0ca3a93c8e\") " pod="openshift-network-diagnostics/network-check-target-dggd8" Apr 23 13:31:51.776691 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:31:51.776671 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 13:31:51.776777 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:31:51.776699 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 13:31:51.776777 ip-10-0-134-22 kubenswrapper[2576]: 
E0423 13:31:51.776712 2576 projected.go:194] Error preparing data for projected volume kube-api-access-5hblj for pod openshift-network-diagnostics/network-check-target-dggd8: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 13:31:51.776777 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:31:51.776764 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ab515304-a26a-4e7e-bfeb-cc0ca3a93c8e-kube-api-access-5hblj podName:ab515304-a26a-4e7e-bfeb-cc0ca3a93c8e nodeName:}" failed. No retries permitted until 2026-04-23 13:31:52.776746257 +0000 UTC m=+4.235205510 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-5hblj" (UniqueName: "kubernetes.io/projected/ab515304-a26a-4e7e-bfeb-cc0ca3a93c8e-kube-api-access-5hblj") pod "network-check-target-dggd8" (UID: "ab515304-a26a-4e7e-bfeb-cc0ca3a93c8e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 13:31:51.781024 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:51.780993 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod349fe628_9a5e_4f45_bd89_4d75157af516.slice/crio-7917b8063b21125ef827d8d69995037a90d48820a38be00514369c998bee7db1 WatchSource:0}: Error finding container 7917b8063b21125ef827d8d69995037a90d48820a38be00514369c998bee7db1: Status 404 returned error can't find the container with id 7917b8063b21125ef827d8d69995037a90d48820a38be00514369c998bee7db1 Apr 23 13:31:51.782489 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:51.782458 2576 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod324be42f_87e9_413c_a39b_1c5ebac3ad6d.slice/crio-1a21429459c698450a9a3ee82ab0ea80e78a5959d02ea251800bbb876bbaccb3 WatchSource:0}: Error finding container 1a21429459c698450a9a3ee82ab0ea80e78a5959d02ea251800bbb876bbaccb3: Status 404 returned error can't find the container with id 1a21429459c698450a9a3ee82ab0ea80e78a5959d02ea251800bbb876bbaccb3 Apr 23 13:31:51.783317 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:51.783252 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8ddfabc2_1040_4841_9473_ed5ba1c0c775.slice/crio-8408fa49bb3b899a8ff60c46db2d2ff2fd5a166f0cf8ae9e25d6871b1c908486 WatchSource:0}: Error finding container 8408fa49bb3b899a8ff60c46db2d2ff2fd5a166f0cf8ae9e25d6871b1c908486: Status 404 returned error can't find the container with id 8408fa49bb3b899a8ff60c46db2d2ff2fd5a166f0cf8ae9e25d6871b1c908486 Apr 23 13:31:51.784961 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:51.784920 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd9ada073_20d9_454e_b803_aef6be2e17c7.slice/crio-5fe55517acc9de7ab37641bae52150a62a317babfa7b6467f082732f048a1f4c WatchSource:0}: Error finding container 5fe55517acc9de7ab37641bae52150a62a317babfa7b6467f082732f048a1f4c: Status 404 returned error can't find the container with id 5fe55517acc9de7ab37641bae52150a62a317babfa7b6467f082732f048a1f4c Apr 23 13:31:51.787439 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:51.787400 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod098e4208_e230_428a_af72_f1aa64c09ce0.slice/crio-b467c14cf44ee78804691b0d705cb869434fc8f1143e880dcad6f3d424687976 WatchSource:0}: Error finding container b467c14cf44ee78804691b0d705cb869434fc8f1143e880dcad6f3d424687976: Status 404 returned error can't find the 
container with id b467c14cf44ee78804691b0d705cb869434fc8f1143e880dcad6f3d424687976 Apr 23 13:31:51.788965 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:51.788942 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddc7fc0e6_0ff2_4765_b7da_e0e49fb829cd.slice/crio-8d0bcf773c04dd10d0af0d2ebc5b8683340e9ad88a39361e402ccd5e00426d48 WatchSource:0}: Error finding container 8d0bcf773c04dd10d0af0d2ebc5b8683340e9ad88a39361e402ccd5e00426d48: Status 404 returned error can't find the container with id 8d0bcf773c04dd10d0af0d2ebc5b8683340e9ad88a39361e402ccd5e00426d48 Apr 23 13:31:51.790175 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:51.790104 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod55a0340a_0400_482b_8422_3e0465f0802d.slice/crio-d7fe8a0ebb38b96981753ffa88e64d5120b73bdbdbddf849aacfc417fdfa0b71 WatchSource:0}: Error finding container d7fe8a0ebb38b96981753ffa88e64d5120b73bdbdbddf849aacfc417fdfa0b71: Status 404 returned error can't find the container with id d7fe8a0ebb38b96981753ffa88e64d5120b73bdbdbddf849aacfc417fdfa0b71 Apr 23 13:31:51.792112 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:51.792084 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podedb014da_0558_4a2a_9f98_bea52a2c723e.slice/crio-85666d6c4d82638d4589d8eab7bc77f4135a6adc8c30f74959464a36bafe9463 WatchSource:0}: Error finding container 85666d6c4d82638d4589d8eab7bc77f4135a6adc8c30f74959464a36bafe9463: Status 404 returned error can't find the container with id 85666d6c4d82638d4589d8eab7bc77f4135a6adc8c30f74959464a36bafe9463 Apr 23 13:31:51.793758 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:31:51.793738 2576 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb7e52e35_9f8f_43be_b9d9_69181afa13ed.slice/crio-4f1e6baaa3cfb6769b4ac3ccbb3ca2b648afbc55aa09386e2b1dab7c3c117633 WatchSource:0}: Error finding container 4f1e6baaa3cfb6769b4ac3ccbb3ca2b648afbc55aa09386e2b1dab7c3c117633: Status 404 returned error can't find the container with id 4f1e6baaa3cfb6769b4ac3ccbb3ca2b648afbc55aa09386e2b1dab7c3c117633 Apr 23 13:31:52.118593 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:52.118261 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-22 13:26:50 +0000 UTC" deadline="2027-09-16 19:19:26.41490027 +0000 UTC" Apr 23 13:31:52.118593 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:52.118524 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12269h47m34.29638228s" Apr 23 13:31:52.156460 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:52.155697 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-kjxcs" Apr 23 13:31:52.156460 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:31:52.155837 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-kjxcs" podUID="2d75111d-f6a4-4ae1-8c0b-e20c6a9b9515" Apr 23 13:31:52.165150 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:52.165010 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-275jz" event={"ID":"55a0340a-0400-482b-8422-3e0465f0802d","Type":"ContainerStarted","Data":"d7fe8a0ebb38b96981753ffa88e64d5120b73bdbdbddf849aacfc417fdfa0b71"} Apr 23 13:31:52.169395 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:52.169368 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-jfnjq" event={"ID":"098e4208-e230-428a-af72-f1aa64c09ce0","Type":"ContainerStarted","Data":"b467c14cf44ee78804691b0d705cb869434fc8f1143e880dcad6f3d424687976"} Apr 23 13:31:52.172271 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:52.172217 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-cj9dc" event={"ID":"8ddfabc2-1040-4841-9473-ed5ba1c0c775","Type":"ContainerStarted","Data":"8408fa49bb3b899a8ff60c46db2d2ff2fd5a166f0cf8ae9e25d6871b1c908486"} Apr 23 13:31:52.175951 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:52.175928 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-66br2" event={"ID":"324be42f-87e9-413c-a39b-1c5ebac3ad6d","Type":"ContainerStarted","Data":"1a21429459c698450a9a3ee82ab0ea80e78a5959d02ea251800bbb876bbaccb3"} Apr 23 13:31:52.178053 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:52.178001 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-57l24" event={"ID":"dc7fc0e6-0ff2-4765-b7da-e0e49fb829cd","Type":"ContainerStarted","Data":"8d0bcf773c04dd10d0af0d2ebc5b8683340e9ad88a39361e402ccd5e00426d48"} Apr 23 13:31:52.182233 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:52.182188 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-zzbbp" 
event={"ID":"d9ada073-20d9-454e-b803-aef6be2e17c7","Type":"ContainerStarted","Data":"5fe55517acc9de7ab37641bae52150a62a317babfa7b6467f082732f048a1f4c"} Apr 23 13:31:52.184962 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:52.184935 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-62dc8" event={"ID":"349fe628-9a5e-4f45-bd89-4d75157af516","Type":"ContainerStarted","Data":"7917b8063b21125ef827d8d69995037a90d48820a38be00514369c998bee7db1"} Apr 23 13:31:52.187087 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:52.187059 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-22.ec2.internal" event={"ID":"1b90ee820fd4186f1e6cd40d24ef3276","Type":"ContainerStarted","Data":"54e73e66cafc7b45a946dd0b41c05fd6fa4b30d6e209acd8de019337bb999a06"} Apr 23 13:31:52.194500 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:52.194460 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-glcfn" event={"ID":"b7e52e35-9f8f-43be-b9d9-69181afa13ed","Type":"ContainerStarted","Data":"4f1e6baaa3cfb6769b4ac3ccbb3ca2b648afbc55aa09386e2b1dab7c3c117633"} Apr 23 13:31:52.204664 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:52.204617 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-72hmc" event={"ID":"edb014da-0558-4a2a-9f98-bea52a2c723e","Type":"ContainerStarted","Data":"85666d6c4d82638d4589d8eab7bc77f4135a6adc8c30f74959464a36bafe9463"} Apr 23 13:31:52.685342 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:52.685295 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0dcb0e48-f774-49cd-8b04-58a5050a5ff2-metrics-certs\") pod \"network-metrics-daemon-564gb\" (UID: \"0dcb0e48-f774-49cd-8b04-58a5050a5ff2\") " pod="openshift-multus/network-metrics-daemon-564gb" Apr 23 13:31:52.685538 ip-10-0-134-22 kubenswrapper[2576]: I0423 
13:31:52.685426 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/2d75111d-f6a4-4ae1-8c0b-e20c6a9b9515-original-pull-secret\") pod \"global-pull-secret-syncer-kjxcs\" (UID: \"2d75111d-f6a4-4ae1-8c0b-e20c6a9b9515\") " pod="kube-system/global-pull-secret-syncer-kjxcs" Apr 23 13:31:52.685607 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:31:52.685553 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 23 13:31:52.685664 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:31:52.685617 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2d75111d-f6a4-4ae1-8c0b-e20c6a9b9515-original-pull-secret podName:2d75111d-f6a4-4ae1-8c0b-e20c6a9b9515 nodeName:}" failed. No retries permitted until 2026-04-23 13:31:54.68559986 +0000 UTC m=+6.144059126 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/2d75111d-f6a4-4ae1-8c0b-e20c6a9b9515-original-pull-secret") pod "global-pull-secret-syncer-kjxcs" (UID: "2d75111d-f6a4-4ae1-8c0b-e20c6a9b9515") : object "kube-system"/"original-pull-secret" not registered Apr 23 13:31:52.686095 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:31:52.686058 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 13:31:52.686190 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:31:52.686119 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0dcb0e48-f774-49cd-8b04-58a5050a5ff2-metrics-certs podName:0dcb0e48-f774-49cd-8b04-58a5050a5ff2 nodeName:}" failed. No retries permitted until 2026-04-23 13:31:54.686103937 +0000 UTC m=+6.144563201 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0dcb0e48-f774-49cd-8b04-58a5050a5ff2-metrics-certs") pod "network-metrics-daemon-564gb" (UID: "0dcb0e48-f774-49cd-8b04-58a5050a5ff2") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 13:31:52.786267 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:52.786228 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5hblj\" (UniqueName: \"kubernetes.io/projected/ab515304-a26a-4e7e-bfeb-cc0ca3a93c8e-kube-api-access-5hblj\") pod \"network-check-target-dggd8\" (UID: \"ab515304-a26a-4e7e-bfeb-cc0ca3a93c8e\") " pod="openshift-network-diagnostics/network-check-target-dggd8" Apr 23 13:31:52.786478 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:31:52.786459 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 13:31:52.786564 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:31:52.786484 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 13:31:52.786564 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:31:52.786496 2576 projected.go:194] Error preparing data for projected volume kube-api-access-5hblj for pod openshift-network-diagnostics/network-check-target-dggd8: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 13:31:52.786564 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:31:52.786559 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ab515304-a26a-4e7e-bfeb-cc0ca3a93c8e-kube-api-access-5hblj podName:ab515304-a26a-4e7e-bfeb-cc0ca3a93c8e nodeName:}" failed. 
No retries permitted until 2026-04-23 13:31:54.786539976 +0000 UTC m=+6.244999260 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-5hblj" (UniqueName: "kubernetes.io/projected/ab515304-a26a-4e7e-bfeb-cc0ca3a93c8e-kube-api-access-5hblj") pod "network-check-target-dggd8" (UID: "ab515304-a26a-4e7e-bfeb-cc0ca3a93c8e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 13:31:53.156498 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:53.156462 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-564gb" Apr 23 13:31:53.156917 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:31:53.156606 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-564gb" podUID="0dcb0e48-f774-49cd-8b04-58a5050a5ff2" Apr 23 13:31:53.156917 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:53.156642 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dggd8" Apr 23 13:31:53.156917 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:31:53.156723 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-dggd8" podUID="ab515304-a26a-4e7e-bfeb-cc0ca3a93c8e" Apr 23 13:31:53.214439 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:53.214379 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-22.ec2.internal" event={"ID":"f546dccbfe88d958c8bad79dd015e11c","Type":"ContainerStarted","Data":"d1d243dce1567e7a4f23f17ede8a747f312b7ae3fd648e56355f162826256bc6"} Apr 23 13:31:53.227437 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:53.227367 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-22.ec2.internal" podStartSLOduration=3.227341398 podStartE2EDuration="3.227341398s" podCreationTimestamp="2026-04-23 13:31:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 13:31:52.203303512 +0000 UTC m=+3.661762787" watchObservedRunningTime="2026-04-23 13:31:53.227341398 +0000 UTC m=+4.685800674" Apr 23 13:31:54.155338 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:54.155301 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-kjxcs" Apr 23 13:31:54.155549 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:31:54.155463 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-kjxcs" podUID="2d75111d-f6a4-4ae1-8c0b-e20c6a9b9515" Apr 23 13:31:54.226055 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:54.226020 2576 generic.go:358] "Generic (PLEG): container finished" podID="f546dccbfe88d958c8bad79dd015e11c" containerID="d1d243dce1567e7a4f23f17ede8a747f312b7ae3fd648e56355f162826256bc6" exitCode=0 Apr 23 13:31:54.226493 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:54.226081 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-22.ec2.internal" event={"ID":"f546dccbfe88d958c8bad79dd015e11c","Type":"ContainerDied","Data":"d1d243dce1567e7a4f23f17ede8a747f312b7ae3fd648e56355f162826256bc6"} Apr 23 13:31:54.702740 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:54.702700 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/2d75111d-f6a4-4ae1-8c0b-e20c6a9b9515-original-pull-secret\") pod \"global-pull-secret-syncer-kjxcs\" (UID: \"2d75111d-f6a4-4ae1-8c0b-e20c6a9b9515\") " pod="kube-system/global-pull-secret-syncer-kjxcs" Apr 23 13:31:54.702925 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:54.702762 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0dcb0e48-f774-49cd-8b04-58a5050a5ff2-metrics-certs\") pod \"network-metrics-daemon-564gb\" (UID: \"0dcb0e48-f774-49cd-8b04-58a5050a5ff2\") " pod="openshift-multus/network-metrics-daemon-564gb" Apr 23 13:31:54.702925 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:31:54.702873 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 23 13:31:54.702925 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:31:54.702905 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not 
registered Apr 23 13:31:54.703084 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:31:54.702946 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2d75111d-f6a4-4ae1-8c0b-e20c6a9b9515-original-pull-secret podName:2d75111d-f6a4-4ae1-8c0b-e20c6a9b9515 nodeName:}" failed. No retries permitted until 2026-04-23 13:31:58.702926436 +0000 UTC m=+10.161385696 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/2d75111d-f6a4-4ae1-8c0b-e20c6a9b9515-original-pull-secret") pod "global-pull-secret-syncer-kjxcs" (UID: "2d75111d-f6a4-4ae1-8c0b-e20c6a9b9515") : object "kube-system"/"original-pull-secret" not registered Apr 23 13:31:54.703084 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:31:54.702968 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0dcb0e48-f774-49cd-8b04-58a5050a5ff2-metrics-certs podName:0dcb0e48-f774-49cd-8b04-58a5050a5ff2 nodeName:}" failed. No retries permitted until 2026-04-23 13:31:58.70295764 +0000 UTC m=+10.161416892 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0dcb0e48-f774-49cd-8b04-58a5050a5ff2-metrics-certs") pod "network-metrics-daemon-564gb" (UID: "0dcb0e48-f774-49cd-8b04-58a5050a5ff2") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 13:31:54.804308 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:54.804270 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5hblj\" (UniqueName: \"kubernetes.io/projected/ab515304-a26a-4e7e-bfeb-cc0ca3a93c8e-kube-api-access-5hblj\") pod \"network-check-target-dggd8\" (UID: \"ab515304-a26a-4e7e-bfeb-cc0ca3a93c8e\") " pod="openshift-network-diagnostics/network-check-target-dggd8" Apr 23 13:31:54.804526 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:31:54.804448 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 13:31:54.804526 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:31:54.804471 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 13:31:54.804526 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:31:54.804484 2576 projected.go:194] Error preparing data for projected volume kube-api-access-5hblj for pod openshift-network-diagnostics/network-check-target-dggd8: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 13:31:54.804687 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:31:54.804547 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ab515304-a26a-4e7e-bfeb-cc0ca3a93c8e-kube-api-access-5hblj podName:ab515304-a26a-4e7e-bfeb-cc0ca3a93c8e nodeName:}" failed. 
No retries permitted until 2026-04-23 13:31:58.804527954 +0000 UTC m=+10.262987214 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-5hblj" (UniqueName: "kubernetes.io/projected/ab515304-a26a-4e7e-bfeb-cc0ca3a93c8e-kube-api-access-5hblj") pod "network-check-target-dggd8" (UID: "ab515304-a26a-4e7e-bfeb-cc0ca3a93c8e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 13:31:55.156507 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:55.156406 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dggd8" Apr 23 13:31:55.156507 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:55.156474 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-564gb" Apr 23 13:31:55.156734 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:31:55.156573 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dggd8" podUID="ab515304-a26a-4e7e-bfeb-cc0ca3a93c8e" Apr 23 13:31:55.156734 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:31:55.156716 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-564gb" podUID="0dcb0e48-f774-49cd-8b04-58a5050a5ff2" Apr 23 13:31:56.156018 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:56.155978 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-kjxcs" Apr 23 13:31:56.156476 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:31:56.156125 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-kjxcs" podUID="2d75111d-f6a4-4ae1-8c0b-e20c6a9b9515" Apr 23 13:31:57.155958 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:57.155927 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dggd8" Apr 23 13:31:57.156148 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:57.155930 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-564gb" Apr 23 13:31:57.156148 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:31:57.156075 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-dggd8" podUID="ab515304-a26a-4e7e-bfeb-cc0ca3a93c8e" Apr 23 13:31:57.156599 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:31:57.156159 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-564gb" podUID="0dcb0e48-f774-49cd-8b04-58a5050a5ff2" Apr 23 13:31:58.155946 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:58.155479 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-kjxcs" Apr 23 13:31:58.155946 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:31:58.155614 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-kjxcs" podUID="2d75111d-f6a4-4ae1-8c0b-e20c6a9b9515" Apr 23 13:31:58.737695 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:58.737659 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/2d75111d-f6a4-4ae1-8c0b-e20c6a9b9515-original-pull-secret\") pod \"global-pull-secret-syncer-kjxcs\" (UID: \"2d75111d-f6a4-4ae1-8c0b-e20c6a9b9515\") " pod="kube-system/global-pull-secret-syncer-kjxcs" Apr 23 13:31:58.738136 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:58.737717 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0dcb0e48-f774-49cd-8b04-58a5050a5ff2-metrics-certs\") pod \"network-metrics-daemon-564gb\" (UID: \"0dcb0e48-f774-49cd-8b04-58a5050a5ff2\") " pod="openshift-multus/network-metrics-daemon-564gb" Apr 23 13:31:58.738136 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:31:58.737824 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 13:31:58.738136 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:31:58.737892 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0dcb0e48-f774-49cd-8b04-58a5050a5ff2-metrics-certs podName:0dcb0e48-f774-49cd-8b04-58a5050a5ff2 nodeName:}" failed. No retries permitted until 2026-04-23 13:32:06.737871489 +0000 UTC m=+18.196330755 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0dcb0e48-f774-49cd-8b04-58a5050a5ff2-metrics-certs") pod "network-metrics-daemon-564gb" (UID: "0dcb0e48-f774-49cd-8b04-58a5050a5ff2") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 13:31:58.738136 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:31:58.737824 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 23 13:31:58.738136 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:31:58.737951 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2d75111d-f6a4-4ae1-8c0b-e20c6a9b9515-original-pull-secret podName:2d75111d-f6a4-4ae1-8c0b-e20c6a9b9515 nodeName:}" failed. No retries permitted until 2026-04-23 13:32:06.737936711 +0000 UTC m=+18.196395966 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/2d75111d-f6a4-4ae1-8c0b-e20c6a9b9515-original-pull-secret") pod "global-pull-secret-syncer-kjxcs" (UID: "2d75111d-f6a4-4ae1-8c0b-e20c6a9b9515") : object "kube-system"/"original-pull-secret" not registered Apr 23 13:31:58.838844 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:58.838760 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5hblj\" (UniqueName: \"kubernetes.io/projected/ab515304-a26a-4e7e-bfeb-cc0ca3a93c8e-kube-api-access-5hblj\") pod \"network-check-target-dggd8\" (UID: \"ab515304-a26a-4e7e-bfeb-cc0ca3a93c8e\") " pod="openshift-network-diagnostics/network-check-target-dggd8" Apr 23 13:31:58.839003 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:31:58.838940 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 13:31:58.839003 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:31:58.838965 2576 
projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 13:31:58.839003 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:31:58.838980 2576 projected.go:194] Error preparing data for projected volume kube-api-access-5hblj for pod openshift-network-diagnostics/network-check-target-dggd8: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 13:31:58.839168 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:31:58.839035 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ab515304-a26a-4e7e-bfeb-cc0ca3a93c8e-kube-api-access-5hblj podName:ab515304-a26a-4e7e-bfeb-cc0ca3a93c8e nodeName:}" failed. No retries permitted until 2026-04-23 13:32:06.83901747 +0000 UTC m=+18.297476736 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-5hblj" (UniqueName: "kubernetes.io/projected/ab515304-a26a-4e7e-bfeb-cc0ca3a93c8e-kube-api-access-5hblj") pod "network-check-target-dggd8" (UID: "ab515304-a26a-4e7e-bfeb-cc0ca3a93c8e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 13:31:59.156571 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:59.156541 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-564gb" Apr 23 13:31:59.156571 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:31:59.156579 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dggd8" Apr 23 13:31:59.156799 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:31:59.156731 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-564gb" podUID="0dcb0e48-f774-49cd-8b04-58a5050a5ff2" Apr 23 13:31:59.157180 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:31:59.157128 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dggd8" podUID="ab515304-a26a-4e7e-bfeb-cc0ca3a93c8e" Apr 23 13:32:00.156115 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:00.155935 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-kjxcs" Apr 23 13:32:00.156115 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:32:00.156070 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-kjxcs" podUID="2d75111d-f6a4-4ae1-8c0b-e20c6a9b9515" Apr 23 13:32:01.155695 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:01.155613 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dggd8"
Apr 23 13:32:01.155844 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:01.155628 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-564gb"
Apr 23 13:32:01.155844 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:32:01.155769 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dggd8" podUID="ab515304-a26a-4e7e-bfeb-cc0ca3a93c8e"
Apr 23 13:32:01.155961 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:32:01.155873 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-564gb" podUID="0dcb0e48-f774-49cd-8b04-58a5050a5ff2"
Apr 23 13:32:02.155826 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:02.155787 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-kjxcs"
Apr 23 13:32:02.156228 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:32:02.155917 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-kjxcs" podUID="2d75111d-f6a4-4ae1-8c0b-e20c6a9b9515"
Apr 23 13:32:03.155883 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:03.155843 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dggd8"
Apr 23 13:32:03.156353 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:32:03.155976 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dggd8" podUID="ab515304-a26a-4e7e-bfeb-cc0ca3a93c8e"
Apr 23 13:32:03.156353 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:03.156023 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-564gb"
Apr 23 13:32:03.156353 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:32:03.156140 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-564gb" podUID="0dcb0e48-f774-49cd-8b04-58a5050a5ff2"
Apr 23 13:32:04.156145 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:04.156118 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-kjxcs"
Apr 23 13:32:04.156520 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:32:04.156223 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-kjxcs" podUID="2d75111d-f6a4-4ae1-8c0b-e20c6a9b9515"
Apr 23 13:32:05.155971 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:05.155872 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-564gb"
Apr 23 13:32:05.155971 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:05.155912 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dggd8"
Apr 23 13:32:05.156223 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:32:05.156038 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-564gb" podUID="0dcb0e48-f774-49cd-8b04-58a5050a5ff2"
Apr 23 13:32:05.156223 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:32:05.156167 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dggd8" podUID="ab515304-a26a-4e7e-bfeb-cc0ca3a93c8e"
Apr 23 13:32:06.155685 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:06.155653 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-kjxcs"
Apr 23 13:32:06.155852 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:32:06.155776 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-kjxcs" podUID="2d75111d-f6a4-4ae1-8c0b-e20c6a9b9515"
Apr 23 13:32:06.798966 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:06.798930 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/2d75111d-f6a4-4ae1-8c0b-e20c6a9b9515-original-pull-secret\") pod \"global-pull-secret-syncer-kjxcs\" (UID: \"2d75111d-f6a4-4ae1-8c0b-e20c6a9b9515\") " pod="kube-system/global-pull-secret-syncer-kjxcs"
Apr 23 13:32:06.798966 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:06.798975 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0dcb0e48-f774-49cd-8b04-58a5050a5ff2-metrics-certs\") pod \"network-metrics-daemon-564gb\" (UID: \"0dcb0e48-f774-49cd-8b04-58a5050a5ff2\") " pod="openshift-multus/network-metrics-daemon-564gb"
Apr 23 13:32:06.799635 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:32:06.799092 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 13:32:06.799635 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:32:06.799096 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 23 13:32:06.799635 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:32:06.799150 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0dcb0e48-f774-49cd-8b04-58a5050a5ff2-metrics-certs podName:0dcb0e48-f774-49cd-8b04-58a5050a5ff2 nodeName:}" failed. No retries permitted until 2026-04-23 13:32:22.7991326 +0000 UTC m=+34.257591853 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0dcb0e48-f774-49cd-8b04-58a5050a5ff2-metrics-certs") pod "network-metrics-daemon-564gb" (UID: "0dcb0e48-f774-49cd-8b04-58a5050a5ff2") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 13:32:06.799635 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:32:06.799168 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2d75111d-f6a4-4ae1-8c0b-e20c6a9b9515-original-pull-secret podName:2d75111d-f6a4-4ae1-8c0b-e20c6a9b9515 nodeName:}" failed. No retries permitted until 2026-04-23 13:32:22.799159477 +0000 UTC m=+34.257618735 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/2d75111d-f6a4-4ae1-8c0b-e20c6a9b9515-original-pull-secret") pod "global-pull-secret-syncer-kjxcs" (UID: "2d75111d-f6a4-4ae1-8c0b-e20c6a9b9515") : object "kube-system"/"original-pull-secret" not registered
Apr 23 13:32:06.899998 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:06.899965 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5hblj\" (UniqueName: \"kubernetes.io/projected/ab515304-a26a-4e7e-bfeb-cc0ca3a93c8e-kube-api-access-5hblj\") pod \"network-check-target-dggd8\" (UID: \"ab515304-a26a-4e7e-bfeb-cc0ca3a93c8e\") " pod="openshift-network-diagnostics/network-check-target-dggd8"
Apr 23 13:32:06.900182 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:32:06.900144 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 23 13:32:06.900182 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:32:06.900168 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 23 13:32:06.900182 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:32:06.900179 2576 projected.go:194] Error preparing data for projected volume kube-api-access-5hblj for pod openshift-network-diagnostics/network-check-target-dggd8: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 13:32:06.900342 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:32:06.900244 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ab515304-a26a-4e7e-bfeb-cc0ca3a93c8e-kube-api-access-5hblj podName:ab515304-a26a-4e7e-bfeb-cc0ca3a93c8e nodeName:}" failed. No retries permitted until 2026-04-23 13:32:22.900224351 +0000 UTC m=+34.358683629 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-5hblj" (UniqueName: "kubernetes.io/projected/ab515304-a26a-4e7e-bfeb-cc0ca3a93c8e-kube-api-access-5hblj") pod "network-check-target-dggd8" (UID: "ab515304-a26a-4e7e-bfeb-cc0ca3a93c8e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 13:32:07.155941 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:07.155850 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dggd8"
Apr 23 13:32:07.156107 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:32:07.155973 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dggd8" podUID="ab515304-a26a-4e7e-bfeb-cc0ca3a93c8e"
Apr 23 13:32:07.156107 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:07.156034 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-564gb"
Apr 23 13:32:07.156217 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:32:07.156135 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-564gb" podUID="0dcb0e48-f774-49cd-8b04-58a5050a5ff2"
Apr 23 13:32:08.155667 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:08.155632 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-kjxcs"
Apr 23 13:32:08.156154 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:32:08.155777 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-kjxcs" podUID="2d75111d-f6a4-4ae1-8c0b-e20c6a9b9515"
Apr 23 13:32:09.155830 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:09.155806 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dggd8"
Apr 23 13:32:09.156520 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:32:09.155883 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dggd8" podUID="ab515304-a26a-4e7e-bfeb-cc0ca3a93c8e"
Apr 23 13:32:09.156520 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:09.155965 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-564gb"
Apr 23 13:32:09.156520 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:32:09.156051 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-564gb" podUID="0dcb0e48-f774-49cd-8b04-58a5050a5ff2"
Apr 23 13:32:09.259048 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:09.258839 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-72hmc_edb014da-0558-4a2a-9f98-bea52a2c723e/ovn-acl-logging/0.log"
Apr 23 13:32:09.259456 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:09.259405 2576 generic.go:358] "Generic (PLEG): container finished" podID="edb014da-0558-4a2a-9f98-bea52a2c723e" containerID="c7b40b55163f715da9fd5b381b2c95a354fa2baf5a3c0f1e156c6f59645f1568" exitCode=1
Apr 23 13:32:09.259552 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:09.259448 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-72hmc" event={"ID":"edb014da-0558-4a2a-9f98-bea52a2c723e","Type":"ContainerStarted","Data":"0d2ba279108941fbe6c421d4b659835410f6f64448f301683ed4c8ed6a07b93d"}
Apr 23 13:32:09.259552 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:09.259486 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-72hmc" event={"ID":"edb014da-0558-4a2a-9f98-bea52a2c723e","Type":"ContainerStarted","Data":"f615152b0319d96db539ab67aac47c3d2b92374aac4928d8c19d255f062eaca7"}
Apr 23 13:32:09.259552 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:09.259500 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-72hmc" event={"ID":"edb014da-0558-4a2a-9f98-bea52a2c723e","Type":"ContainerDied","Data":"c7b40b55163f715da9fd5b381b2c95a354fa2baf5a3c0f1e156c6f59645f1568"}
Apr 23 13:32:09.259552 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:09.259512 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-72hmc" event={"ID":"edb014da-0558-4a2a-9f98-bea52a2c723e","Type":"ContainerStarted","Data":"03e53d00c6eb54eb97272d334659ee29e8989a4774660f504244c11d79e2bf33"}
Apr 23 13:32:09.260942 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:09.260919 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-275jz" event={"ID":"55a0340a-0400-482b-8422-3e0465f0802d","Type":"ContainerStarted","Data":"691c762ac4567cf6f60113a1e6bf6177930108785edb93a4329e3ade2c61da9f"}
Apr 23 13:32:09.262453 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:09.262426 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-jfnjq" event={"ID":"098e4208-e230-428a-af72-f1aa64c09ce0","Type":"ContainerStarted","Data":"549a68f8ea9e852e3abfaf4dd05f0591b96fa5e9632614866df0a5a84eb34da5"}
Apr 23 13:32:09.263884 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:09.263833 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-cj9dc" event={"ID":"8ddfabc2-1040-4841-9473-ed5ba1c0c775","Type":"ContainerStarted","Data":"2dc08807a30247237953adb846c56bdc00059f2bfac32472b5db5cdb885a48b0"}
Apr 23 13:32:09.265424 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:09.265389 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-57l24" event={"ID":"dc7fc0e6-0ff2-4765-b7da-e0e49fb829cd","Type":"ContainerStarted","Data":"8d50cdc89e4afffed61aa9d727945cb9acc014e7c2912d7967d0f5c2edb517a8"}
Apr 23 13:32:09.266749 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:09.266724 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-zzbbp" event={"ID":"d9ada073-20d9-454e-b803-aef6be2e17c7","Type":"ContainerStarted","Data":"550ea5ae93bdf34c833913202b886990915ca35411bf03316888cc57faf3cc03"}
Apr 23 13:32:09.268321 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:09.268291 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-62dc8" event={"ID":"349fe628-9a5e-4f45-bd89-4d75157af516","Type":"ContainerStarted","Data":"4f46db054b15256149ce797dddb3227e5788b83fb750e75c4a5295aed035bc9f"}
Apr 23 13:32:09.270165 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:09.270132 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-22.ec2.internal" event={"ID":"f546dccbfe88d958c8bad79dd015e11c","Type":"ContainerStarted","Data":"4e8f7524a4915a27551a9b0a4168cdffcc481b76a79f948bd8d347e65a75b252"}
Apr 23 13:32:09.271489 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:09.271469 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-glcfn" event={"ID":"b7e52e35-9f8f-43be-b9d9-69181afa13ed","Type":"ContainerStarted","Data":"67dd49d8e65b3c8962be8d48c8209e9e46113b07cf6f91336ec426bf714a8da4"}
Apr 23 13:32:09.277887 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:09.277850 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-275jz" podStartSLOduration=3.437470413 podStartE2EDuration="20.27783917s" podCreationTimestamp="2026-04-23 13:31:49 +0000 UTC" firstStartedPulling="2026-04-23 13:31:51.793280264 +0000 UTC m=+3.251739519" lastFinishedPulling="2026-04-23 13:32:08.633649017 +0000 UTC m=+20.092108276" observedRunningTime="2026-04-23 13:32:09.277465824 +0000 UTC m=+20.735925097" watchObservedRunningTime="2026-04-23 13:32:09.27783917 +0000 UTC m=+20.736298444"
Apr 23 13:32:09.306444 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:09.306383 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-22.ec2.internal" podStartSLOduration=19.306368325 podStartE2EDuration="19.306368325s" podCreationTimestamp="2026-04-23 13:31:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 13:32:09.30582903 +0000 UTC m=+20.764288303" watchObservedRunningTime="2026-04-23 13:32:09.306368325 +0000 UTC m=+20.764827599"
Apr 23 13:32:09.317737 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:09.317687 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-62dc8" podStartSLOduration=11.36713654 podStartE2EDuration="20.317673092s" podCreationTimestamp="2026-04-23 13:31:49 +0000 UTC" firstStartedPulling="2026-04-23 13:31:51.78294552 +0000 UTC m=+3.241404772" lastFinishedPulling="2026-04-23 13:32:00.733482071 +0000 UTC m=+12.191941324" observedRunningTime="2026-04-23 13:32:09.317355417 +0000 UTC m=+20.775814691" watchObservedRunningTime="2026-04-23 13:32:09.317673092 +0000 UTC m=+20.776132367"
Apr 23 13:32:09.328305 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:09.328264 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-jfnjq" podStartSLOduration=3.6661453489999998 podStartE2EDuration="20.328250174s" podCreationTimestamp="2026-04-23 13:31:49 +0000 UTC" firstStartedPulling="2026-04-23 13:31:51.789063038 +0000 UTC m=+3.247522293" lastFinishedPulling="2026-04-23 13:32:08.451167866 +0000 UTC m=+19.909627118" observedRunningTime="2026-04-23 13:32:09.327993442 +0000 UTC m=+20.786452718" watchObservedRunningTime="2026-04-23 13:32:09.328250174 +0000 UTC m=+20.786709502"
Apr 23 13:32:09.342753 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:09.342707 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-zzbbp" podStartSLOduration=3.503541022 podStartE2EDuration="20.342690255s" podCreationTimestamp="2026-04-23 13:31:49 +0000 UTC" firstStartedPulling="2026-04-23 13:31:51.786301252 +0000 UTC m=+3.244760507" lastFinishedPulling="2026-04-23 13:32:08.625450478 +0000 UTC m=+20.083909740" observedRunningTime="2026-04-23 13:32:09.342394344 +0000 UTC m=+20.800853617" watchObservedRunningTime="2026-04-23 13:32:09.342690255 +0000 UTC m=+20.801149530"
Apr 23 13:32:09.357302 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:09.357263 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-cj9dc" podStartSLOduration=3.546405468 podStartE2EDuration="20.357248865s" podCreationTimestamp="2026-04-23 13:31:49 +0000 UTC" firstStartedPulling="2026-04-23 13:31:51.785289562 +0000 UTC m=+3.243748815" lastFinishedPulling="2026-04-23 13:32:08.596132947 +0000 UTC m=+20.054592212" observedRunningTime="2026-04-23 13:32:09.35698085 +0000 UTC m=+20.815440123" watchObservedRunningTime="2026-04-23 13:32:09.357248865 +0000 UTC m=+20.815708139"
Apr 23 13:32:09.360109 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:09.360082 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-62dc8"
Apr 23 13:32:09.360617 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:09.360599 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-62dc8"
Apr 23 13:32:10.156070 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:10.156038 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-kjxcs"
Apr 23 13:32:10.156503 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:32:10.156141 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-kjxcs" podUID="2d75111d-f6a4-4ae1-8c0b-e20c6a9b9515"
Apr 23 13:32:10.275115 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:10.275086 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-66br2" event={"ID":"324be42f-87e9-413c-a39b-1c5ebac3ad6d","Type":"ContainerStarted","Data":"0ef71cfa812801a5e09741fb6a8c06789d7b84d24bf3395ab7f34d412d9fed8c"}
Apr 23 13:32:10.276549 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:10.276518 2576 generic.go:358] "Generic (PLEG): container finished" podID="dc7fc0e6-0ff2-4765-b7da-e0e49fb829cd" containerID="8d50cdc89e4afffed61aa9d727945cb9acc014e7c2912d7967d0f5c2edb517a8" exitCode=0
Apr 23 13:32:10.276658 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:10.276593 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-57l24" event={"ID":"dc7fc0e6-0ff2-4765-b7da-e0e49fb829cd","Type":"ContainerDied","Data":"8d50cdc89e4afffed61aa9d727945cb9acc014e7c2912d7967d0f5c2edb517a8"}
Apr 23 13:32:10.279350 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:10.279324 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-72hmc_edb014da-0558-4a2a-9f98-bea52a2c723e/ovn-acl-logging/0.log"
Apr 23 13:32:10.279769 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:10.279694 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-72hmc" event={"ID":"edb014da-0558-4a2a-9f98-bea52a2c723e","Type":"ContainerStarted","Data":"07ef96122dfb6101f98523e0037dbc81c19092ff693401e471986d741df3cc2e"}
Apr 23 13:32:10.279769 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:10.279735 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-72hmc" event={"ID":"edb014da-0558-4a2a-9f98-bea52a2c723e","Type":"ContainerStarted","Data":"c8ac206e9eec196d55b1b4f3e41662d094cf1fe06e5f500737ed37d47a3c1b30"}
Apr 23 13:32:10.296151 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:10.296107 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-66br2" podStartSLOduration=4.488216384 podStartE2EDuration="21.296095193s" podCreationTimestamp="2026-04-23 13:31:49 +0000 UTC" firstStartedPulling="2026-04-23 13:31:51.784795638 +0000 UTC m=+3.243254904" lastFinishedPulling="2026-04-23 13:32:08.592674447 +0000 UTC m=+20.051133713" observedRunningTime="2026-04-23 13:32:10.295407105 +0000 UTC m=+21.753866380" watchObservedRunningTime="2026-04-23 13:32:10.296095193 +0000 UTC m=+21.754554467"
Apr 23 13:32:10.440481 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:10.440396 2576 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 23 13:32:11.134349 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:11.134245 2576 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-23T13:32:10.440494788Z","UUID":"304901e4-dfeb-4699-b52c-70deab6e73d5","Handler":null,"Name":"","Endpoint":""}
Apr 23 13:32:11.137009 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:11.136964 2576 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 23 13:32:11.137009 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:11.136999 2576 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 23 13:32:11.155542 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:11.155517 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dggd8"
Apr 23 13:32:11.155686 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:32:11.155627 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dggd8" podUID="ab515304-a26a-4e7e-bfeb-cc0ca3a93c8e"
Apr 23 13:32:11.155763 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:11.155730 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-564gb"
Apr 23 13:32:11.155879 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:32:11.155855 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-564gb" podUID="0dcb0e48-f774-49cd-8b04-58a5050a5ff2"
Apr 23 13:32:11.283916 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:11.283866 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-glcfn" event={"ID":"b7e52e35-9f8f-43be-b9d9-69181afa13ed","Type":"ContainerStarted","Data":"c49bf7d7b1afd2212d4789950a36f9d7a3a6b683e5df7979183599812b51a74b"}
Apr 23 13:32:11.284279 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:11.283928 2576 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 23 13:32:12.156151 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:12.155966 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-kjxcs"
Apr 23 13:32:12.156353 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:32:12.156235 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-kjxcs" podUID="2d75111d-f6a4-4ae1-8c0b-e20c6a9b9515"
Apr 23 13:32:12.287647 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:12.287611 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-glcfn" event={"ID":"b7e52e35-9f8f-43be-b9d9-69181afa13ed","Type":"ContainerStarted","Data":"3e4527b3e97ff7daaef8c02d55580d8a7fa438090c53a80ce4a15cf685cb5291"}
Apr 23 13:32:12.290044 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:12.290025 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-72hmc_edb014da-0558-4a2a-9f98-bea52a2c723e/ovn-acl-logging/0.log"
Apr 23 13:32:12.290307 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:12.290289 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-72hmc" event={"ID":"edb014da-0558-4a2a-9f98-bea52a2c723e","Type":"ContainerStarted","Data":"dd3567a896a530160dd113875e1a5ef01654e16db62314410dd543816e73c049"}
Apr 23 13:32:12.304853 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:12.304807 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-glcfn" podStartSLOduration=3.554009775 podStartE2EDuration="23.304796564s" podCreationTimestamp="2026-04-23 13:31:49 +0000 UTC" firstStartedPulling="2026-04-23 13:31:51.795472877 +0000 UTC m=+3.253932130" lastFinishedPulling="2026-04-23 13:32:11.546259663 +0000 UTC m=+23.004718919" observedRunningTime="2026-04-23 13:32:12.304347589 +0000 UTC m=+23.762806853" watchObservedRunningTime="2026-04-23 13:32:12.304796564 +0000 UTC m=+23.763255880"
Apr 23 13:32:13.156045 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:13.156010 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dggd8"
Apr 23 13:32:13.156045 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:13.156046 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-564gb"
Apr 23 13:32:13.156274 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:32:13.156128 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dggd8" podUID="ab515304-a26a-4e7e-bfeb-cc0ca3a93c8e"
Apr 23 13:32:13.156327 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:32:13.156265 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-564gb" podUID="0dcb0e48-f774-49cd-8b04-58a5050a5ff2"
Apr 23 13:32:14.155678 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:14.155536 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-kjxcs"
Apr 23 13:32:14.156482 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:32:14.155751 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-kjxcs" podUID="2d75111d-f6a4-4ae1-8c0b-e20c6a9b9515"
Apr 23 13:32:14.295069 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:14.295036 2576 generic.go:358] "Generic (PLEG): container finished" podID="dc7fc0e6-0ff2-4765-b7da-e0e49fb829cd" containerID="93c066818e579dca70fdd7e7eb7ed76f5ca3da762fef4f85a5b638c6b8bde794" exitCode=0
Apr 23 13:32:14.295189 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:14.295116 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-57l24" event={"ID":"dc7fc0e6-0ff2-4765-b7da-e0e49fb829cd","Type":"ContainerDied","Data":"93c066818e579dca70fdd7e7eb7ed76f5ca3da762fef4f85a5b638c6b8bde794"}
Apr 23 13:32:14.298366 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:14.298346 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-72hmc_edb014da-0558-4a2a-9f98-bea52a2c723e/ovn-acl-logging/0.log"
Apr 23 13:32:14.298716 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:14.298692 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-72hmc" event={"ID":"edb014da-0558-4a2a-9f98-bea52a2c723e","Type":"ContainerStarted","Data":"898248d48b27af7cc1c21228a826c587ba62df7330dc9a134490e92d0e9d564b"}
Apr 23 13:32:14.298969 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:14.298955 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-72hmc"
Apr 23 13:32:14.299041 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:14.298978 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-72hmc"
Apr 23 13:32:14.299164 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:14.299119 2576 scope.go:117] "RemoveContainer" containerID="c7b40b55163f715da9fd5b381b2c95a354fa2baf5a3c0f1e156c6f59645f1568"
Apr 23 13:32:14.314002 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:14.313976 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-72hmc"
Apr 23 13:32:14.314104 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:14.314051 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-72hmc"
Apr 23 13:32:15.155808 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:15.155769 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dggd8"
Apr 23 13:32:15.156183 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:15.155767 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-564gb"
Apr 23 13:32:15.156183 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:32:15.155880 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dggd8" podUID="ab515304-a26a-4e7e-bfeb-cc0ca3a93c8e"
Apr 23 13:32:15.156183 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:32:15.155994 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-564gb" podUID="0dcb0e48-f774-49cd-8b04-58a5050a5ff2"
Apr 23 13:32:15.305332 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:15.305194 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-72hmc_edb014da-0558-4a2a-9f98-bea52a2c723e/ovn-acl-logging/0.log"
Apr 23 13:32:15.307059 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:15.306120 2576 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 23 13:32:15.307290 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:15.307229 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-72hmc" event={"ID":"edb014da-0558-4a2a-9f98-bea52a2c723e","Type":"ContainerStarted","Data":"84badc5cd01a6f1a863b2554d5264b787779a827407673990c3cd92ec578f729"}
Apr 23 13:32:15.309942 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:15.309908 2576 generic.go:358] "Generic (PLEG): container finished" podID="dc7fc0e6-0ff2-4765-b7da-e0e49fb829cd" containerID="e67f66dee89233a95f9617be3d78ca362de36fbb4e1f98b296cb18c84a6898eb" exitCode=0
Apr 23 13:32:15.310085 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:15.309949 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-57l24" event={"ID":"dc7fc0e6-0ff2-4765-b7da-e0e49fb829cd","Type":"ContainerDied","Data":"e67f66dee89233a95f9617be3d78ca362de36fbb4e1f98b296cb18c84a6898eb"}
Apr 23 13:32:15.333248 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:15.333210 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-72hmc" podStartSLOduration=9.443856755 podStartE2EDuration="26.333199398s" podCreationTimestamp="2026-04-23 13:31:49 +0000 UTC" firstStartedPulling="2026-04-23 13:31:51.794537858 +0000 UTC m=+3.252997110" lastFinishedPulling="2026-04-23 13:32:08.683880495 +0000 UTC m=+20.142339753" observedRunningTime="2026-04-23 13:32:15.33220129 +0000
UTC m=+26.790660564" watchObservedRunningTime="2026-04-23 13:32:15.333199398 +0000 UTC m=+26.791658671" Apr 23 13:32:15.962660 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:15.962611 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-kjxcs"] Apr 23 13:32:15.962795 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:15.962753 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-kjxcs" Apr 23 13:32:15.962917 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:32:15.962862 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-kjxcs" podUID="2d75111d-f6a4-4ae1-8c0b-e20c6a9b9515" Apr 23 13:32:15.965792 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:15.965760 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-564gb"] Apr 23 13:32:15.965919 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:15.965863 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-564gb" Apr 23 13:32:15.965981 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:32:15.965965 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-564gb" podUID="0dcb0e48-f774-49cd-8b04-58a5050a5ff2" Apr 23 13:32:15.967593 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:15.967568 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-62dc8" Apr 23 13:32:15.967696 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:15.967678 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-72hmc" Apr 23 13:32:15.967760 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:15.967705 2576 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 23 13:32:15.968219 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:15.968200 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-62dc8" Apr 23 13:32:15.969025 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:15.969005 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-dggd8"] Apr 23 13:32:15.969127 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:15.969074 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dggd8" Apr 23 13:32:15.969187 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:32:15.969140 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-dggd8" podUID="ab515304-a26a-4e7e-bfeb-cc0ca3a93c8e" Apr 23 13:32:16.314359 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:16.314117 2576 generic.go:358] "Generic (PLEG): container finished" podID="dc7fc0e6-0ff2-4765-b7da-e0e49fb829cd" containerID="251c95b11bc84772704c8f5e5eb596fb38004a48bd0b54d77c824b9130496490" exitCode=0 Apr 23 13:32:16.314359 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:16.314227 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-57l24" event={"ID":"dc7fc0e6-0ff2-4765-b7da-e0e49fb829cd","Type":"ContainerDied","Data":"251c95b11bc84772704c8f5e5eb596fb38004a48bd0b54d77c824b9130496490"} Apr 23 13:32:17.156044 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:17.155996 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-564gb" Apr 23 13:32:17.156044 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:17.156036 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dggd8" Apr 23 13:32:17.156274 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:32:17.156143 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-564gb" podUID="0dcb0e48-f774-49cd-8b04-58a5050a5ff2" Apr 23 13:32:17.156337 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:32:17.156265 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dggd8" podUID="ab515304-a26a-4e7e-bfeb-cc0ca3a93c8e" Apr 23 13:32:18.155605 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:18.155577 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-kjxcs" Apr 23 13:32:18.156021 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:32:18.155702 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-kjxcs" podUID="2d75111d-f6a4-4ae1-8c0b-e20c6a9b9515" Apr 23 13:32:19.156408 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:19.156227 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-564gb" Apr 23 13:32:19.156408 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:19.156240 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dggd8" Apr 23 13:32:19.156968 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:32:19.156539 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-564gb" podUID="0dcb0e48-f774-49cd-8b04-58a5050a5ff2" Apr 23 13:32:19.156968 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:32:19.156647 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dggd8" podUID="ab515304-a26a-4e7e-bfeb-cc0ca3a93c8e" Apr 23 13:32:20.155926 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:20.155891 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-kjxcs" Apr 23 13:32:20.156100 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:32:20.156015 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-kjxcs" podUID="2d75111d-f6a4-4ae1-8c0b-e20c6a9b9515" Apr 23 13:32:21.155996 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:21.155958 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dggd8" Apr 23 13:32:21.156639 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:32:21.156082 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-dggd8" podUID="ab515304-a26a-4e7e-bfeb-cc0ca3a93c8e" Apr 23 13:32:21.156639 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:21.156148 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-564gb" Apr 23 13:32:21.156639 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:32:21.156262 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-564gb" podUID="0dcb0e48-f774-49cd-8b04-58a5050a5ff2" Apr 23 13:32:21.796843 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:21.796768 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-22.ec2.internal" event="NodeReady" Apr 23 13:32:21.797002 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:21.796919 2576 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 23 13:32:21.831701 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:21.831669 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-5449456df7-4lgzh"] Apr 23 13:32:21.845515 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:21.845491 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-x77wr"] Apr 23 13:32:21.845647 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:21.845621 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-5449456df7-4lgzh" Apr 23 13:32:21.847996 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:21.847958 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 23 13:32:21.848114 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:21.848040 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-nw8fz\"" Apr 23 13:32:21.848529 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:21.848515 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 23 13:32:21.848978 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:21.848934 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 23 13:32:21.853316 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:21.853282 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 23 13:32:21.863801 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:21.863775 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5449456df7-4lgzh"] Apr 23 13:32:21.863801 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:21.863801 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-x77wr"] Apr 23 13:32:21.863935 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:21.863883 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-x77wr" Apr 23 13:32:21.866149 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:21.866121 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 23 13:32:21.866262 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:21.866175 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 23 13:32:21.866262 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:21.866137 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-8bjdp\"" Apr 23 13:32:21.866262 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:21.866242 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 23 13:32:21.909812 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:21.909785 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/65f33a6a-b4d7-451f-a523-ddee419b31c8-ca-trust-extracted\") pod \"image-registry-5449456df7-4lgzh\" (UID: \"65f33a6a-b4d7-451f-a523-ddee419b31c8\") " pod="openshift-image-registry/image-registry-5449456df7-4lgzh" Apr 23 13:32:21.909923 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:21.909833 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/65f33a6a-b4d7-451f-a523-ddee419b31c8-installation-pull-secrets\") pod \"image-registry-5449456df7-4lgzh\" (UID: \"65f33a6a-b4d7-451f-a523-ddee419b31c8\") " pod="openshift-image-registry/image-registry-5449456df7-4lgzh" Apr 23 13:32:21.909923 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:21.909877 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/65f33a6a-b4d7-451f-a523-ddee419b31c8-registry-tls\") pod \"image-registry-5449456df7-4lgzh\" (UID: \"65f33a6a-b4d7-451f-a523-ddee419b31c8\") " pod="openshift-image-registry/image-registry-5449456df7-4lgzh" Apr 23 13:32:21.909923 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:21.909916 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/65f33a6a-b4d7-451f-a523-ddee419b31c8-trusted-ca\") pod \"image-registry-5449456df7-4lgzh\" (UID: \"65f33a6a-b4d7-451f-a523-ddee419b31c8\") " pod="openshift-image-registry/image-registry-5449456df7-4lgzh" Apr 23 13:32:21.910041 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:21.909933 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fln4h\" (UniqueName: \"kubernetes.io/projected/65f33a6a-b4d7-451f-a523-ddee419b31c8-kube-api-access-fln4h\") pod \"image-registry-5449456df7-4lgzh\" (UID: \"65f33a6a-b4d7-451f-a523-ddee419b31c8\") " pod="openshift-image-registry/image-registry-5449456df7-4lgzh" Apr 23 13:32:21.910041 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:21.909951 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/65f33a6a-b4d7-451f-a523-ddee419b31c8-registry-certificates\") pod \"image-registry-5449456df7-4lgzh\" (UID: \"65f33a6a-b4d7-451f-a523-ddee419b31c8\") " pod="openshift-image-registry/image-registry-5449456df7-4lgzh" Apr 23 13:32:21.910041 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:21.909968 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: 
\"kubernetes.io/secret/65f33a6a-b4d7-451f-a523-ddee419b31c8-image-registry-private-configuration\") pod \"image-registry-5449456df7-4lgzh\" (UID: \"65f33a6a-b4d7-451f-a523-ddee419b31c8\") " pod="openshift-image-registry/image-registry-5449456df7-4lgzh" Apr 23 13:32:21.910041 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:21.909983 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/65f33a6a-b4d7-451f-a523-ddee419b31c8-bound-sa-token\") pod \"image-registry-5449456df7-4lgzh\" (UID: \"65f33a6a-b4d7-451f-a523-ddee419b31c8\") " pod="openshift-image-registry/image-registry-5449456df7-4lgzh" Apr 23 13:32:21.949571 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:21.949540 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-294bt"] Apr 23 13:32:21.962305 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:21.962282 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-294bt"] Apr 23 13:32:21.962432 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:21.962396 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-294bt" Apr 23 13:32:21.964585 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:21.964564 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 23 13:32:21.964782 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:21.964762 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-vtx8h\"" Apr 23 13:32:21.964847 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:21.964810 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 23 13:32:22.011103 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:22.011080 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/65f33a6a-b4d7-451f-a523-ddee419b31c8-installation-pull-secrets\") pod \"image-registry-5449456df7-4lgzh\" (UID: \"65f33a6a-b4d7-451f-a523-ddee419b31c8\") " pod="openshift-image-registry/image-registry-5449456df7-4lgzh" Apr 23 13:32:22.011245 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:22.011127 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/65f33a6a-b4d7-451f-a523-ddee419b31c8-registry-tls\") pod \"image-registry-5449456df7-4lgzh\" (UID: \"65f33a6a-b4d7-451f-a523-ddee419b31c8\") " pod="openshift-image-registry/image-registry-5449456df7-4lgzh" Apr 23 13:32:22.011245 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:22.011148 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/37a73ca9-a37a-469b-8043-50d6b6f5ae10-cert\") pod \"ingress-canary-x77wr\" (UID: \"37a73ca9-a37a-469b-8043-50d6b6f5ae10\") " pod="openshift-ingress-canary/ingress-canary-x77wr" Apr 23 13:32:22.011245 ip-10-0-134-22 kubenswrapper[2576]: 
I0423 13:32:22.011184 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/65f33a6a-b4d7-451f-a523-ddee419b31c8-trusted-ca\") pod \"image-registry-5449456df7-4lgzh\" (UID: \"65f33a6a-b4d7-451f-a523-ddee419b31c8\") " pod="openshift-image-registry/image-registry-5449456df7-4lgzh" Apr 23 13:32:22.011245 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:22.011202 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fln4h\" (UniqueName: \"kubernetes.io/projected/65f33a6a-b4d7-451f-a523-ddee419b31c8-kube-api-access-fln4h\") pod \"image-registry-5449456df7-4lgzh\" (UID: \"65f33a6a-b4d7-451f-a523-ddee419b31c8\") " pod="openshift-image-registry/image-registry-5449456df7-4lgzh" Apr 23 13:32:22.011245 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:22.011219 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-285qn\" (UniqueName: \"kubernetes.io/projected/37a73ca9-a37a-469b-8043-50d6b6f5ae10-kube-api-access-285qn\") pod \"ingress-canary-x77wr\" (UID: \"37a73ca9-a37a-469b-8043-50d6b6f5ae10\") " pod="openshift-ingress-canary/ingress-canary-x77wr" Apr 23 13:32:22.011245 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:22.011244 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/65f33a6a-b4d7-451f-a523-ddee419b31c8-registry-certificates\") pod \"image-registry-5449456df7-4lgzh\" (UID: \"65f33a6a-b4d7-451f-a523-ddee419b31c8\") " pod="openshift-image-registry/image-registry-5449456df7-4lgzh" Apr 23 13:32:22.011586 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:32:22.011248 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 23 13:32:22.011586 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:32:22.011265 2576 projected.go:194] 
Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5449456df7-4lgzh: secret "image-registry-tls" not found Apr 23 13:32:22.011586 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:22.011272 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/65f33a6a-b4d7-451f-a523-ddee419b31c8-image-registry-private-configuration\") pod \"image-registry-5449456df7-4lgzh\" (UID: \"65f33a6a-b4d7-451f-a523-ddee419b31c8\") " pod="openshift-image-registry/image-registry-5449456df7-4lgzh" Apr 23 13:32:22.011586 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:22.011299 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/65f33a6a-b4d7-451f-a523-ddee419b31c8-bound-sa-token\") pod \"image-registry-5449456df7-4lgzh\" (UID: \"65f33a6a-b4d7-451f-a523-ddee419b31c8\") " pod="openshift-image-registry/image-registry-5449456df7-4lgzh" Apr 23 13:32:22.011586 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:32:22.011347 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/65f33a6a-b4d7-451f-a523-ddee419b31c8-registry-tls podName:65f33a6a-b4d7-451f-a523-ddee419b31c8 nodeName:}" failed. No retries permitted until 2026-04-23 13:32:22.511314823 +0000 UTC m=+33.969774089 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/65f33a6a-b4d7-451f-a523-ddee419b31c8-registry-tls") pod "image-registry-5449456df7-4lgzh" (UID: "65f33a6a-b4d7-451f-a523-ddee419b31c8") : secret "image-registry-tls" not found Apr 23 13:32:22.011586 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:22.011396 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/65f33a6a-b4d7-451f-a523-ddee419b31c8-ca-trust-extracted\") pod \"image-registry-5449456df7-4lgzh\" (UID: \"65f33a6a-b4d7-451f-a523-ddee419b31c8\") " pod="openshift-image-registry/image-registry-5449456df7-4lgzh" Apr 23 13:32:22.011877 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:22.011841 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/65f33a6a-b4d7-451f-a523-ddee419b31c8-ca-trust-extracted\") pod \"image-registry-5449456df7-4lgzh\" (UID: \"65f33a6a-b4d7-451f-a523-ddee419b31c8\") " pod="openshift-image-registry/image-registry-5449456df7-4lgzh" Apr 23 13:32:22.011930 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:22.011882 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/65f33a6a-b4d7-451f-a523-ddee419b31c8-registry-certificates\") pod \"image-registry-5449456df7-4lgzh\" (UID: \"65f33a6a-b4d7-451f-a523-ddee419b31c8\") " pod="openshift-image-registry/image-registry-5449456df7-4lgzh" Apr 23 13:32:22.012275 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:22.012255 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/65f33a6a-b4d7-451f-a523-ddee419b31c8-trusted-ca\") pod \"image-registry-5449456df7-4lgzh\" (UID: \"65f33a6a-b4d7-451f-a523-ddee419b31c8\") " pod="openshift-image-registry/image-registry-5449456df7-4lgzh" Apr 23 
Apr 23 13:32:22.014983 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:22.014963 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/65f33a6a-b4d7-451f-a523-ddee419b31c8-installation-pull-secrets\") pod \"image-registry-5449456df7-4lgzh\" (UID: \"65f33a6a-b4d7-451f-a523-ddee419b31c8\") " pod="openshift-image-registry/image-registry-5449456df7-4lgzh"
Apr 23 13:32:22.015063 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:22.014963 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/65f33a6a-b4d7-451f-a523-ddee419b31c8-image-registry-private-configuration\") pod \"image-registry-5449456df7-4lgzh\" (UID: \"65f33a6a-b4d7-451f-a523-ddee419b31c8\") " pod="openshift-image-registry/image-registry-5449456df7-4lgzh"
Apr 23 13:32:22.019970 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:22.019949 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/65f33a6a-b4d7-451f-a523-ddee419b31c8-bound-sa-token\") pod \"image-registry-5449456df7-4lgzh\" (UID: \"65f33a6a-b4d7-451f-a523-ddee419b31c8\") " pod="openshift-image-registry/image-registry-5449456df7-4lgzh"
Apr 23 13:32:22.020163 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:22.020138 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fln4h\" (UniqueName: \"kubernetes.io/projected/65f33a6a-b4d7-451f-a523-ddee419b31c8-kube-api-access-fln4h\") pod \"image-registry-5449456df7-4lgzh\" (UID: \"65f33a6a-b4d7-451f-a523-ddee419b31c8\") " pod="openshift-image-registry/image-registry-5449456df7-4lgzh"
Apr 23 13:32:22.112300 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:22.112226 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-285qn\" (UniqueName: \"kubernetes.io/projected/37a73ca9-a37a-469b-8043-50d6b6f5ae10-kube-api-access-285qn\") pod \"ingress-canary-x77wr\" (UID: \"37a73ca9-a37a-469b-8043-50d6b6f5ae10\") " pod="openshift-ingress-canary/ingress-canary-x77wr"
Apr 23 13:32:22.112300 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:22.112263 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/85ae1732-8725-4694-959f-e9e424548aca-tmp-dir\") pod \"dns-default-294bt\" (UID: \"85ae1732-8725-4694-959f-e9e424548aca\") " pod="openshift-dns/dns-default-294bt"
Apr 23 13:32:22.112300 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:22.112287 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/85ae1732-8725-4694-959f-e9e424548aca-config-volume\") pod \"dns-default-294bt\" (UID: \"85ae1732-8725-4694-959f-e9e424548aca\") " pod="openshift-dns/dns-default-294bt"
Apr 23 13:32:22.112527 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:22.112349 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vmq5j\" (UniqueName: \"kubernetes.io/projected/85ae1732-8725-4694-959f-e9e424548aca-kube-api-access-vmq5j\") pod \"dns-default-294bt\" (UID: \"85ae1732-8725-4694-959f-e9e424548aca\") " pod="openshift-dns/dns-default-294bt"
Apr 23 13:32:22.112527 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:22.112398 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/37a73ca9-a37a-469b-8043-50d6b6f5ae10-cert\") pod \"ingress-canary-x77wr\" (UID: \"37a73ca9-a37a-469b-8043-50d6b6f5ae10\") " pod="openshift-ingress-canary/ingress-canary-x77wr"
Apr 23 13:32:22.112527 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:22.112442 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/85ae1732-8725-4694-959f-e9e424548aca-metrics-tls\") pod \"dns-default-294bt\" (UID: \"85ae1732-8725-4694-959f-e9e424548aca\") " pod="openshift-dns/dns-default-294bt"
Apr 23 13:32:22.112527 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:32:22.112509 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 23 13:32:22.112641 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:32:22.112572 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/37a73ca9-a37a-469b-8043-50d6b6f5ae10-cert podName:37a73ca9-a37a-469b-8043-50d6b6f5ae10 nodeName:}" failed. No retries permitted until 2026-04-23 13:32:22.612557752 +0000 UTC m=+34.071017009 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/37a73ca9-a37a-469b-8043-50d6b6f5ae10-cert") pod "ingress-canary-x77wr" (UID: "37a73ca9-a37a-469b-8043-50d6b6f5ae10") : secret "canary-serving-cert" not found
Apr 23 13:32:22.122919 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:22.122893 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-285qn\" (UniqueName: \"kubernetes.io/projected/37a73ca9-a37a-469b-8043-50d6b6f5ae10-kube-api-access-285qn\") pod \"ingress-canary-x77wr\" (UID: \"37a73ca9-a37a-469b-8043-50d6b6f5ae10\") " pod="openshift-ingress-canary/ingress-canary-x77wr"
Apr 23 13:32:22.156013 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:22.155987 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-kjxcs"
Apr 23 13:32:22.158960 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:22.158941 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 23 13:32:22.213693 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:22.213665 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/85ae1732-8725-4694-959f-e9e424548aca-config-volume\") pod \"dns-default-294bt\" (UID: \"85ae1732-8725-4694-959f-e9e424548aca\") " pod="openshift-dns/dns-default-294bt"
Apr 23 13:32:22.213812 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:22.213742 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vmq5j\" (UniqueName: \"kubernetes.io/projected/85ae1732-8725-4694-959f-e9e424548aca-kube-api-access-vmq5j\") pod \"dns-default-294bt\" (UID: \"85ae1732-8725-4694-959f-e9e424548aca\") " pod="openshift-dns/dns-default-294bt"
Apr 23 13:32:22.213871 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:22.213819 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/85ae1732-8725-4694-959f-e9e424548aca-metrics-tls\") pod \"dns-default-294bt\" (UID: \"85ae1732-8725-4694-959f-e9e424548aca\") " pod="openshift-dns/dns-default-294bt"
Apr 23 13:32:22.213960 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:32:22.213940 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 23 13:32:22.214032 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:22.213967 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/85ae1732-8725-4694-959f-e9e424548aca-tmp-dir\") pod \"dns-default-294bt\" (UID: \"85ae1732-8725-4694-959f-e9e424548aca\") " pod="openshift-dns/dns-default-294bt"
Apr 23 13:32:22.214032 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:32:22.214005 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/85ae1732-8725-4694-959f-e9e424548aca-metrics-tls podName:85ae1732-8725-4694-959f-e9e424548aca nodeName:}" failed. No retries permitted until 2026-04-23 13:32:22.713984883 +0000 UTC m=+34.172444159 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/85ae1732-8725-4694-959f-e9e424548aca-metrics-tls") pod "dns-default-294bt" (UID: "85ae1732-8725-4694-959f-e9e424548aca") : secret "dns-default-metrics-tls" not found
Apr 23 13:32:22.214259 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:22.214234 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/85ae1732-8725-4694-959f-e9e424548aca-tmp-dir\") pod \"dns-default-294bt\" (UID: \"85ae1732-8725-4694-959f-e9e424548aca\") " pod="openshift-dns/dns-default-294bt"
Apr 23 13:32:22.214370 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:22.214267 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/85ae1732-8725-4694-959f-e9e424548aca-config-volume\") pod \"dns-default-294bt\" (UID: \"85ae1732-8725-4694-959f-e9e424548aca\") " pod="openshift-dns/dns-default-294bt"
Apr 23 13:32:22.234537 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:22.234513 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vmq5j\" (UniqueName: \"kubernetes.io/projected/85ae1732-8725-4694-959f-e9e424548aca-kube-api-access-vmq5j\") pod \"dns-default-294bt\" (UID: \"85ae1732-8725-4694-959f-e9e424548aca\") " pod="openshift-dns/dns-default-294bt"
Apr 23 13:32:22.327970 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:22.327927 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-57l24" event={"ID":"dc7fc0e6-0ff2-4765-b7da-e0e49fb829cd","Type":"ContainerStarted","Data":"e22d91ea073f862617e917c5f1fa1f372e7b8008fb29c5f924c652ca176410c8"}
Apr 23 13:32:22.516504 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:22.516470 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/65f33a6a-b4d7-451f-a523-ddee419b31c8-registry-tls\") pod \"image-registry-5449456df7-4lgzh\" (UID: \"65f33a6a-b4d7-451f-a523-ddee419b31c8\") " pod="openshift-image-registry/image-registry-5449456df7-4lgzh"
Apr 23 13:32:22.516637 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:32:22.516622 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 23 13:32:22.516689 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:32:22.516639 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5449456df7-4lgzh: secret "image-registry-tls" not found
Apr 23 13:32:22.516727 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:32:22.516700 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/65f33a6a-b4d7-451f-a523-ddee419b31c8-registry-tls podName:65f33a6a-b4d7-451f-a523-ddee419b31c8 nodeName:}" failed. No retries permitted until 2026-04-23 13:32:23.516675466 +0000 UTC m=+34.975134738 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/65f33a6a-b4d7-451f-a523-ddee419b31c8-registry-tls") pod "image-registry-5449456df7-4lgzh" (UID: "65f33a6a-b4d7-451f-a523-ddee419b31c8") : secret "image-registry-tls" not found
Apr 23 13:32:22.617691 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:22.617654 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/37a73ca9-a37a-469b-8043-50d6b6f5ae10-cert\") pod \"ingress-canary-x77wr\" (UID: \"37a73ca9-a37a-469b-8043-50d6b6f5ae10\") " pod="openshift-ingress-canary/ingress-canary-x77wr"
Apr 23 13:32:22.617822 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:32:22.617805 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 23 13:32:22.617882 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:32:22.617871 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/37a73ca9-a37a-469b-8043-50d6b6f5ae10-cert podName:37a73ca9-a37a-469b-8043-50d6b6f5ae10 nodeName:}" failed. No retries permitted until 2026-04-23 13:32:23.617853013 +0000 UTC m=+35.076312284 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/37a73ca9-a37a-469b-8043-50d6b6f5ae10-cert") pod "ingress-canary-x77wr" (UID: "37a73ca9-a37a-469b-8043-50d6b6f5ae10") : secret "canary-serving-cert" not found
Apr 23 13:32:22.718729 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:22.718669 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/85ae1732-8725-4694-959f-e9e424548aca-metrics-tls\") pod \"dns-default-294bt\" (UID: \"85ae1732-8725-4694-959f-e9e424548aca\") " pod="openshift-dns/dns-default-294bt"
Apr 23 13:32:22.718822 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:32:22.718805 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 23 13:32:22.718874 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:32:22.718865 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/85ae1732-8725-4694-959f-e9e424548aca-metrics-tls podName:85ae1732-8725-4694-959f-e9e424548aca nodeName:}" failed. No retries permitted until 2026-04-23 13:32:23.718851073 +0000 UTC m=+35.177310326 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/85ae1732-8725-4694-959f-e9e424548aca-metrics-tls") pod "dns-default-294bt" (UID: "85ae1732-8725-4694-959f-e9e424548aca") : secret "dns-default-metrics-tls" not found
Apr 23 13:32:22.818967 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:22.818931 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0dcb0e48-f774-49cd-8b04-58a5050a5ff2-metrics-certs\") pod \"network-metrics-daemon-564gb\" (UID: \"0dcb0e48-f774-49cd-8b04-58a5050a5ff2\") " pod="openshift-multus/network-metrics-daemon-564gb"
Apr 23 13:32:22.819115 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:22.819046 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/2d75111d-f6a4-4ae1-8c0b-e20c6a9b9515-original-pull-secret\") pod \"global-pull-secret-syncer-kjxcs\" (UID: \"2d75111d-f6a4-4ae1-8c0b-e20c6a9b9515\") " pod="kube-system/global-pull-secret-syncer-kjxcs"
Apr 23 13:32:22.819115 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:32:22.819069 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 13:32:22.819184 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:32:22.819134 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0dcb0e48-f774-49cd-8b04-58a5050a5ff2-metrics-certs podName:0dcb0e48-f774-49cd-8b04-58a5050a5ff2 nodeName:}" failed. No retries permitted until 2026-04-23 13:32:54.819120047 +0000 UTC m=+66.277579299 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0dcb0e48-f774-49cd-8b04-58a5050a5ff2-metrics-certs") pod "network-metrics-daemon-564gb" (UID: "0dcb0e48-f774-49cd-8b04-58a5050a5ff2") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 13:32:22.821275 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:22.821249 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/2d75111d-f6a4-4ae1-8c0b-e20c6a9b9515-original-pull-secret\") pod \"global-pull-secret-syncer-kjxcs\" (UID: \"2d75111d-f6a4-4ae1-8c0b-e20c6a9b9515\") " pod="kube-system/global-pull-secret-syncer-kjxcs"
Apr 23 13:32:22.919578 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:22.919544 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5hblj\" (UniqueName: \"kubernetes.io/projected/ab515304-a26a-4e7e-bfeb-cc0ca3a93c8e-kube-api-access-5hblj\") pod \"network-check-target-dggd8\" (UID: \"ab515304-a26a-4e7e-bfeb-cc0ca3a93c8e\") " pod="openshift-network-diagnostics/network-check-target-dggd8"
Apr 23 13:32:22.919713 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:32:22.919694 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 23 13:32:22.919772 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:32:22.919720 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 23 13:32:22.919772 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:32:22.919731 2576 projected.go:194] Error preparing data for projected volume kube-api-access-5hblj for pod openshift-network-diagnostics/network-check-target-dggd8: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 13:32:22.919838 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:32:22.919780 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ab515304-a26a-4e7e-bfeb-cc0ca3a93c8e-kube-api-access-5hblj podName:ab515304-a26a-4e7e-bfeb-cc0ca3a93c8e nodeName:}" failed. No retries permitted until 2026-04-23 13:32:54.919765225 +0000 UTC m=+66.378224477 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-5hblj" (UniqueName: "kubernetes.io/projected/ab515304-a26a-4e7e-bfeb-cc0ca3a93c8e-kube-api-access-5hblj") pod "network-check-target-dggd8" (UID: "ab515304-a26a-4e7e-bfeb-cc0ca3a93c8e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 13:32:23.064890 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:23.064864 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-kjxcs"
Apr 23 13:32:23.155392 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:23.155240 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dggd8"
Apr 23 13:32:23.155554 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:23.155240 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-564gb"
Apr 23 13:32:23.158649 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:23.158627 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-b4pgl\""
Apr 23 13:32:23.159105 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:23.158736 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 23 13:32:23.159105 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:23.158862 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 23 13:32:23.159105 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:23.158961 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 23 13:32:23.159105 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:23.159003 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-rl8cq\""
Apr 23 13:32:23.227237 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:23.227204 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-kjxcs"]
Apr 23 13:32:23.231253 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:32:23.231222 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2d75111d_f6a4_4ae1_8c0b_e20c6a9b9515.slice/crio-f8a9622ed5a06f4b638cad6a4309129810961b8477d7e19f62665c0d983d2372 WatchSource:0}: Error finding container f8a9622ed5a06f4b638cad6a4309129810961b8477d7e19f62665c0d983d2372: Status 404 returned error can't find the container with id f8a9622ed5a06f4b638cad6a4309129810961b8477d7e19f62665c0d983d2372
Apr 23 13:32:23.331780 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:23.331678 2576 generic.go:358] "Generic (PLEG): container finished" podID="dc7fc0e6-0ff2-4765-b7da-e0e49fb829cd" containerID="e22d91ea073f862617e917c5f1fa1f372e7b8008fb29c5f924c652ca176410c8" exitCode=0
Apr 23 13:32:23.331780 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:23.331755 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-57l24" event={"ID":"dc7fc0e6-0ff2-4765-b7da-e0e49fb829cd","Type":"ContainerDied","Data":"e22d91ea073f862617e917c5f1fa1f372e7b8008fb29c5f924c652ca176410c8"}
Apr 23 13:32:23.332923 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:23.332897 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-kjxcs" event={"ID":"2d75111d-f6a4-4ae1-8c0b-e20c6a9b9515","Type":"ContainerStarted","Data":"f8a9622ed5a06f4b638cad6a4309129810961b8477d7e19f62665c0d983d2372"}
Apr 23 13:32:23.526369 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:23.526338 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/65f33a6a-b4d7-451f-a523-ddee419b31c8-registry-tls\") pod \"image-registry-5449456df7-4lgzh\" (UID: \"65f33a6a-b4d7-451f-a523-ddee419b31c8\") " pod="openshift-image-registry/image-registry-5449456df7-4lgzh"
Apr 23 13:32:23.526543 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:32:23.526522 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 23 13:32:23.526621 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:32:23.526547 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5449456df7-4lgzh: secret "image-registry-tls" not found
Apr 23 13:32:23.526675 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:32:23.526620 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/65f33a6a-b4d7-451f-a523-ddee419b31c8-registry-tls podName:65f33a6a-b4d7-451f-a523-ddee419b31c8 nodeName:}" failed. No retries permitted until 2026-04-23 13:32:25.52659697 +0000 UTC m=+36.985056238 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/65f33a6a-b4d7-451f-a523-ddee419b31c8-registry-tls") pod "image-registry-5449456df7-4lgzh" (UID: "65f33a6a-b4d7-451f-a523-ddee419b31c8") : secret "image-registry-tls" not found
Apr 23 13:32:23.627733 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:23.627645 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/37a73ca9-a37a-469b-8043-50d6b6f5ae10-cert\") pod \"ingress-canary-x77wr\" (UID: \"37a73ca9-a37a-469b-8043-50d6b6f5ae10\") " pod="openshift-ingress-canary/ingress-canary-x77wr"
Apr 23 13:32:23.627866 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:32:23.627782 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 23 13:32:23.627866 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:32:23.627846 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/37a73ca9-a37a-469b-8043-50d6b6f5ae10-cert podName:37a73ca9-a37a-469b-8043-50d6b6f5ae10 nodeName:}" failed. No retries permitted until 2026-04-23 13:32:25.627831311 +0000 UTC m=+37.086290563 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/37a73ca9-a37a-469b-8043-50d6b6f5ae10-cert") pod "ingress-canary-x77wr" (UID: "37a73ca9-a37a-469b-8043-50d6b6f5ae10") : secret "canary-serving-cert" not found
Apr 23 13:32:23.728747 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:23.728714 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/85ae1732-8725-4694-959f-e9e424548aca-metrics-tls\") pod \"dns-default-294bt\" (UID: \"85ae1732-8725-4694-959f-e9e424548aca\") " pod="openshift-dns/dns-default-294bt"
Apr 23 13:32:23.728914 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:32:23.728885 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 23 13:32:23.728977 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:32:23.728969 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/85ae1732-8725-4694-959f-e9e424548aca-metrics-tls podName:85ae1732-8725-4694-959f-e9e424548aca nodeName:}" failed. No retries permitted until 2026-04-23 13:32:25.728945727 +0000 UTC m=+37.187404981 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/85ae1732-8725-4694-959f-e9e424548aca-metrics-tls") pod "dns-default-294bt" (UID: "85ae1732-8725-4694-959f-e9e424548aca") : secret "dns-default-metrics-tls" not found
Apr 23 13:32:24.338587 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:24.338500 2576 generic.go:358] "Generic (PLEG): container finished" podID="dc7fc0e6-0ff2-4765-b7da-e0e49fb829cd" containerID="3b6e1e0a84a001581d3e57b69941c216c146294c18b30fec40d31deb0b4e7d70" exitCode=0
Apr 23 13:32:24.339017 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:24.338590 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-57l24" event={"ID":"dc7fc0e6-0ff2-4765-b7da-e0e49fb829cd","Type":"ContainerDied","Data":"3b6e1e0a84a001581d3e57b69941c216c146294c18b30fec40d31deb0b4e7d70"}
Apr 23 13:32:25.343385 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:25.343351 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-57l24" event={"ID":"dc7fc0e6-0ff2-4765-b7da-e0e49fb829cd","Type":"ContainerStarted","Data":"cb206d96b9a5cfe394b8baadf79e4dd28b8259eb3815c4545983db466d2a7896"}
Apr 23 13:32:25.367051 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:25.366996 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-57l24" podStartSLOduration=5.991838281 podStartE2EDuration="36.366978875s" podCreationTimestamp="2026-04-23 13:31:49 +0000 UTC" firstStartedPulling="2026-04-23 13:31:51.792495923 +0000 UTC m=+3.250955176" lastFinishedPulling="2026-04-23 13:32:22.167636512 +0000 UTC m=+33.626095770" observedRunningTime="2026-04-23 13:32:25.366153823 +0000 UTC m=+36.824613098" watchObservedRunningTime="2026-04-23 13:32:25.366978875 +0000 UTC m=+36.825438140"
Apr 23 13:32:25.543216 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:25.543182 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/65f33a6a-b4d7-451f-a523-ddee419b31c8-registry-tls\") pod \"image-registry-5449456df7-4lgzh\" (UID: \"65f33a6a-b4d7-451f-a523-ddee419b31c8\") " pod="openshift-image-registry/image-registry-5449456df7-4lgzh"
Apr 23 13:32:25.543367 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:32:25.543348 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 23 13:32:25.543367 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:32:25.543364 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5449456df7-4lgzh: secret "image-registry-tls" not found
Apr 23 13:32:25.543475 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:32:25.543430 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/65f33a6a-b4d7-451f-a523-ddee419b31c8-registry-tls podName:65f33a6a-b4d7-451f-a523-ddee419b31c8 nodeName:}" failed. No retries permitted until 2026-04-23 13:32:29.543399537 +0000 UTC m=+41.001858793 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/65f33a6a-b4d7-451f-a523-ddee419b31c8-registry-tls") pod "image-registry-5449456df7-4lgzh" (UID: "65f33a6a-b4d7-451f-a523-ddee419b31c8") : secret "image-registry-tls" not found
Apr 23 13:32:25.644157 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:25.644127 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/37a73ca9-a37a-469b-8043-50d6b6f5ae10-cert\") pod \"ingress-canary-x77wr\" (UID: \"37a73ca9-a37a-469b-8043-50d6b6f5ae10\") " pod="openshift-ingress-canary/ingress-canary-x77wr"
Apr 23 13:32:25.644293 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:32:25.644274 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 23 13:32:25.644348 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:32:25.644339 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/37a73ca9-a37a-469b-8043-50d6b6f5ae10-cert podName:37a73ca9-a37a-469b-8043-50d6b6f5ae10 nodeName:}" failed. No retries permitted until 2026-04-23 13:32:29.644325085 +0000 UTC m=+41.102784337 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/37a73ca9-a37a-469b-8043-50d6b6f5ae10-cert") pod "ingress-canary-x77wr" (UID: "37a73ca9-a37a-469b-8043-50d6b6f5ae10") : secret "canary-serving-cert" not found
Apr 23 13:32:25.745252 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:25.745210 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/85ae1732-8725-4694-959f-e9e424548aca-metrics-tls\") pod \"dns-default-294bt\" (UID: \"85ae1732-8725-4694-959f-e9e424548aca\") " pod="openshift-dns/dns-default-294bt"
Apr 23 13:32:25.745452 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:32:25.745393 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 23 13:32:25.745508 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:32:25.745493 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/85ae1732-8725-4694-959f-e9e424548aca-metrics-tls podName:85ae1732-8725-4694-959f-e9e424548aca nodeName:}" failed. No retries permitted until 2026-04-23 13:32:29.745472516 +0000 UTC m=+41.203931771 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/85ae1732-8725-4694-959f-e9e424548aca-metrics-tls") pod "dns-default-294bt" (UID: "85ae1732-8725-4694-959f-e9e424548aca") : secret "dns-default-metrics-tls" not found
Apr 23 13:32:29.353242 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:29.353206 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-kjxcs" event={"ID":"2d75111d-f6a4-4ae1-8c0b-e20c6a9b9515","Type":"ContainerStarted","Data":"fec368b4a740183c46a62fb171374bfb0043e406e633307c5c942746f95b7c38"}
Apr 23 13:32:29.367936 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:29.367887 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-kjxcs" podStartSLOduration=35.407645886 podStartE2EDuration="40.367873295s" podCreationTimestamp="2026-04-23 13:31:49 +0000 UTC" firstStartedPulling="2026-04-23 13:32:23.233171383 +0000 UTC m=+34.691630639" lastFinishedPulling="2026-04-23 13:32:28.193398793 +0000 UTC m=+39.651858048" observedRunningTime="2026-04-23 13:32:29.367036809 +0000 UTC m=+40.825496080" watchObservedRunningTime="2026-04-23 13:32:29.367873295 +0000 UTC m=+40.826332568"
Apr 23 13:32:29.576316 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:29.576275 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/65f33a6a-b4d7-451f-a523-ddee419b31c8-registry-tls\") pod \"image-registry-5449456df7-4lgzh\" (UID: \"65f33a6a-b4d7-451f-a523-ddee419b31c8\") " pod="openshift-image-registry/image-registry-5449456df7-4lgzh"
Apr 23 13:32:29.576505 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:32:29.576443 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 23 13:32:29.576505 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:32:29.576463 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5449456df7-4lgzh: secret "image-registry-tls" not found
Apr 23 13:32:29.576577 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:32:29.576518 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/65f33a6a-b4d7-451f-a523-ddee419b31c8-registry-tls podName:65f33a6a-b4d7-451f-a523-ddee419b31c8 nodeName:}" failed. No retries permitted until 2026-04-23 13:32:37.576501562 +0000 UTC m=+49.034960819 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/65f33a6a-b4d7-451f-a523-ddee419b31c8-registry-tls") pod "image-registry-5449456df7-4lgzh" (UID: "65f33a6a-b4d7-451f-a523-ddee419b31c8") : secret "image-registry-tls" not found
Apr 23 13:32:29.677582 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:29.677543 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/37a73ca9-a37a-469b-8043-50d6b6f5ae10-cert\") pod \"ingress-canary-x77wr\" (UID: \"37a73ca9-a37a-469b-8043-50d6b6f5ae10\") " pod="openshift-ingress-canary/ingress-canary-x77wr"
Apr 23 13:32:29.677748 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:32:29.677670 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 23 13:32:29.677748 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:32:29.677736 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/37a73ca9-a37a-469b-8043-50d6b6f5ae10-cert podName:37a73ca9-a37a-469b-8043-50d6b6f5ae10 nodeName:}" failed. No retries permitted until 2026-04-23 13:32:37.677717793 +0000 UTC m=+49.136177046 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/37a73ca9-a37a-469b-8043-50d6b6f5ae10-cert") pod "ingress-canary-x77wr" (UID: "37a73ca9-a37a-469b-8043-50d6b6f5ae10") : secret "canary-serving-cert" not found
Apr 23 13:32:29.778768 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:29.778733 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/85ae1732-8725-4694-959f-e9e424548aca-metrics-tls\") pod \"dns-default-294bt\" (UID: \"85ae1732-8725-4694-959f-e9e424548aca\") " pod="openshift-dns/dns-default-294bt"
Apr 23 13:32:29.778910 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:32:29.778876 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 23 13:32:29.778957 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:32:29.778938 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/85ae1732-8725-4694-959f-e9e424548aca-metrics-tls podName:85ae1732-8725-4694-959f-e9e424548aca nodeName:}" failed. No retries permitted until 2026-04-23 13:32:37.778922627 +0000 UTC m=+49.237381880 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/85ae1732-8725-4694-959f-e9e424548aca-metrics-tls") pod "dns-default-294bt" (UID: "85ae1732-8725-4694-959f-e9e424548aca") : secret "dns-default-metrics-tls" not found
Apr 23 13:32:37.638954 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:37.638914 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/65f33a6a-b4d7-451f-a523-ddee419b31c8-registry-tls\") pod \"image-registry-5449456df7-4lgzh\" (UID: \"65f33a6a-b4d7-451f-a523-ddee419b31c8\") " pod="openshift-image-registry/image-registry-5449456df7-4lgzh"
Apr 23 13:32:37.639524 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:32:37.639070 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 23 13:32:37.639524 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:32:37.639090 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5449456df7-4lgzh: secret "image-registry-tls" not found
Apr 23 13:32:37.639524 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:32:37.639145 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/65f33a6a-b4d7-451f-a523-ddee419b31c8-registry-tls podName:65f33a6a-b4d7-451f-a523-ddee419b31c8 nodeName:}" failed. No retries permitted until 2026-04-23 13:32:53.639130467 +0000 UTC m=+65.097589732 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/65f33a6a-b4d7-451f-a523-ddee419b31c8-registry-tls") pod "image-registry-5449456df7-4lgzh" (UID: "65f33a6a-b4d7-451f-a523-ddee419b31c8") : secret "image-registry-tls" not found
Apr 23 13:32:37.739576 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:37.739529 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/37a73ca9-a37a-469b-8043-50d6b6f5ae10-cert\") pod \"ingress-canary-x77wr\" (UID: \"37a73ca9-a37a-469b-8043-50d6b6f5ae10\") " pod="openshift-ingress-canary/ingress-canary-x77wr"
Apr 23 13:32:37.739725 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:32:37.739672 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 23 13:32:37.739766 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:32:37.739733 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/37a73ca9-a37a-469b-8043-50d6b6f5ae10-cert podName:37a73ca9-a37a-469b-8043-50d6b6f5ae10 nodeName:}" failed. No retries permitted until 2026-04-23 13:32:53.739718115 +0000 UTC m=+65.198177367 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/37a73ca9-a37a-469b-8043-50d6b6f5ae10-cert") pod "ingress-canary-x77wr" (UID: "37a73ca9-a37a-469b-8043-50d6b6f5ae10") : secret "canary-serving-cert" not found
Apr 23 13:32:37.840329 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:37.840287 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/85ae1732-8725-4694-959f-e9e424548aca-metrics-tls\") pod \"dns-default-294bt\" (UID: \"85ae1732-8725-4694-959f-e9e424548aca\") " pod="openshift-dns/dns-default-294bt"
Apr 23 13:32:37.840497 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:32:37.840407 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 23 13:32:37.840497 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:32:37.840483 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/85ae1732-8725-4694-959f-e9e424548aca-metrics-tls podName:85ae1732-8725-4694-959f-e9e424548aca nodeName:}" failed. No retries permitted until 2026-04-23 13:32:53.840469091 +0000 UTC m=+65.298928356 (durationBeforeRetry 16s).
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/85ae1732-8725-4694-959f-e9e424548aca-metrics-tls") pod "dns-default-294bt" (UID: "85ae1732-8725-4694-959f-e9e424548aca") : secret "dns-default-metrics-tls" not found Apr 23 13:32:47.325960 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:47.325932 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-72hmc" Apr 23 13:32:53.644287 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:53.644233 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/65f33a6a-b4d7-451f-a523-ddee419b31c8-registry-tls\") pod \"image-registry-5449456df7-4lgzh\" (UID: \"65f33a6a-b4d7-451f-a523-ddee419b31c8\") " pod="openshift-image-registry/image-registry-5449456df7-4lgzh" Apr 23 13:32:53.644673 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:32:53.644382 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 23 13:32:53.644673 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:32:53.644403 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5449456df7-4lgzh: secret "image-registry-tls" not found Apr 23 13:32:53.644673 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:32:53.644498 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/65f33a6a-b4d7-451f-a523-ddee419b31c8-registry-tls podName:65f33a6a-b4d7-451f-a523-ddee419b31c8 nodeName:}" failed. No retries permitted until 2026-04-23 13:33:25.644481356 +0000 UTC m=+97.102940621 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/65f33a6a-b4d7-451f-a523-ddee419b31c8-registry-tls") pod "image-registry-5449456df7-4lgzh" (UID: "65f33a6a-b4d7-451f-a523-ddee419b31c8") : secret "image-registry-tls" not found Apr 23 13:32:53.745019 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:53.744989 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/37a73ca9-a37a-469b-8043-50d6b6f5ae10-cert\") pod \"ingress-canary-x77wr\" (UID: \"37a73ca9-a37a-469b-8043-50d6b6f5ae10\") " pod="openshift-ingress-canary/ingress-canary-x77wr" Apr 23 13:32:53.745175 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:32:53.745101 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 13:32:53.745175 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:32:53.745150 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/37a73ca9-a37a-469b-8043-50d6b6f5ae10-cert podName:37a73ca9-a37a-469b-8043-50d6b6f5ae10 nodeName:}" failed. No retries permitted until 2026-04-23 13:33:25.745137064 +0000 UTC m=+97.203596316 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/37a73ca9-a37a-469b-8043-50d6b6f5ae10-cert") pod "ingress-canary-x77wr" (UID: "37a73ca9-a37a-469b-8043-50d6b6f5ae10") : secret "canary-serving-cert" not found Apr 23 13:32:53.845919 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:53.845851 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/85ae1732-8725-4694-959f-e9e424548aca-metrics-tls\") pod \"dns-default-294bt\" (UID: \"85ae1732-8725-4694-959f-e9e424548aca\") " pod="openshift-dns/dns-default-294bt" Apr 23 13:32:53.846030 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:32:53.845962 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 13:32:53.846030 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:32:53.846009 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/85ae1732-8725-4694-959f-e9e424548aca-metrics-tls podName:85ae1732-8725-4694-959f-e9e424548aca nodeName:}" failed. No retries permitted until 2026-04-23 13:33:25.845997485 +0000 UTC m=+97.304456737 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/85ae1732-8725-4694-959f-e9e424548aca-metrics-tls") pod "dns-default-294bt" (UID: "85ae1732-8725-4694-959f-e9e424548aca") : secret "dns-default-metrics-tls" not found Apr 23 13:32:54.853891 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:54.853841 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0dcb0e48-f774-49cd-8b04-58a5050a5ff2-metrics-certs\") pod \"network-metrics-daemon-564gb\" (UID: \"0dcb0e48-f774-49cd-8b04-58a5050a5ff2\") " pod="openshift-multus/network-metrics-daemon-564gb" Apr 23 13:32:54.856584 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:54.856566 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 23 13:32:54.864464 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:32:54.864447 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 23 13:32:54.864552 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:32:54.864497 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0dcb0e48-f774-49cd-8b04-58a5050a5ff2-metrics-certs podName:0dcb0e48-f774-49cd-8b04-58a5050a5ff2 nodeName:}" failed. No retries permitted until 2026-04-23 13:33:58.864483755 +0000 UTC m=+130.322943007 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0dcb0e48-f774-49cd-8b04-58a5050a5ff2-metrics-certs") pod "network-metrics-daemon-564gb" (UID: "0dcb0e48-f774-49cd-8b04-58a5050a5ff2") : secret "metrics-daemon-secret" not found Apr 23 13:32:54.955022 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:54.954979 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5hblj\" (UniqueName: \"kubernetes.io/projected/ab515304-a26a-4e7e-bfeb-cc0ca3a93c8e-kube-api-access-5hblj\") pod \"network-check-target-dggd8\" (UID: \"ab515304-a26a-4e7e-bfeb-cc0ca3a93c8e\") " pod="openshift-network-diagnostics/network-check-target-dggd8" Apr 23 13:32:54.957674 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:54.957650 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 23 13:32:54.967695 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:54.967670 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 23 13:32:54.979406 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:54.979385 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5hblj\" (UniqueName: \"kubernetes.io/projected/ab515304-a26a-4e7e-bfeb-cc0ca3a93c8e-kube-api-access-5hblj\") pod \"network-check-target-dggd8\" (UID: \"ab515304-a26a-4e7e-bfeb-cc0ca3a93c8e\") " pod="openshift-network-diagnostics/network-check-target-dggd8" Apr 23 13:32:55.272273 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:55.272242 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-b4pgl\"" Apr 23 13:32:55.280309 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:55.280290 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dggd8" Apr 23 13:32:55.411791 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:55.411762 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-dggd8"] Apr 23 13:32:55.415009 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:32:55.414979 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podab515304_a26a_4e7e_bfeb_cc0ca3a93c8e.slice/crio-1b805e6213e395252239c869eebbaf555bd8459dbcc128ad0e5a2a4e04a224b0 WatchSource:0}: Error finding container 1b805e6213e395252239c869eebbaf555bd8459dbcc128ad0e5a2a4e04a224b0: Status 404 returned error can't find the container with id 1b805e6213e395252239c869eebbaf555bd8459dbcc128ad0e5a2a4e04a224b0 Apr 23 13:32:56.405258 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:56.405212 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-dggd8" event={"ID":"ab515304-a26a-4e7e-bfeb-cc0ca3a93c8e","Type":"ContainerStarted","Data":"1b805e6213e395252239c869eebbaf555bd8459dbcc128ad0e5a2a4e04a224b0"} Apr 23 13:32:58.343631 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:58.343534 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-554c84b6f5-sgbxr"] Apr 23 13:32:58.346374 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:58.346346 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-554c84b6f5-sgbxr" Apr 23 13:32:58.348666 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:58.348643 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 23 13:32:58.348818 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:58.348799 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 23 13:32:58.349610 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:58.349590 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-6knvz\"" Apr 23 13:32:58.349719 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:58.349612 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\"" Apr 23 13:32:58.349719 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:58.349610 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 23 13:32:58.360802 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:58.360777 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-554c84b6f5-sgbxr"] Apr 23 13:32:58.384461 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:58.383429 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x26sf\" (UniqueName: \"kubernetes.io/projected/16f1aa89-3c8a-4d86-9b12-5db500c2ef23-kube-api-access-x26sf\") pod \"managed-serviceaccount-addon-agent-554c84b6f5-sgbxr\" (UID: \"16f1aa89-3c8a-4d86-9b12-5db500c2ef23\") " 
pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-554c84b6f5-sgbxr" Apr 23 13:32:58.384461 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:58.383493 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/16f1aa89-3c8a-4d86-9b12-5db500c2ef23-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-554c84b6f5-sgbxr\" (UID: \"16f1aa89-3c8a-4d86-9b12-5db500c2ef23\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-554c84b6f5-sgbxr" Apr 23 13:32:58.400371 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:58.400343 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-7fc49968d7-hc2br"] Apr 23 13:32:58.403171 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:58.403155 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7fc49968d7-hc2br" Apr 23 13:32:58.405847 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:58.405827 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\"" Apr 23 13:32:58.411868 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:58.411845 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-dggd8" event={"ID":"ab515304-a26a-4e7e-bfeb-cc0ca3a93c8e","Type":"ContainerStarted","Data":"91a0e8f2ef435fa41c529e43ab31834423ac604f304876f662bb0981972d47ae"} Apr 23 13:32:58.411982 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:58.411962 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-dggd8" Apr 23 13:32:58.413609 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:58.413591 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-7fc49968d7-hc2br"] Apr 23 13:32:58.446837 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:58.446787 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-dggd8" podStartSLOduration=66.841942446 podStartE2EDuration="1m9.446770749s" podCreationTimestamp="2026-04-23 13:31:49 +0000 UTC" firstStartedPulling="2026-04-23 13:32:55.416832281 +0000 UTC m=+66.875291533" lastFinishedPulling="2026-04-23 13:32:58.02166058 +0000 UTC m=+69.480119836" observedRunningTime="2026-04-23 13:32:58.446503926 +0000 UTC m=+69.904963200" watchObservedRunningTime="2026-04-23 13:32:58.446770749 +0000 UTC m=+69.905230016" Apr 23 13:32:58.484269 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:58.484235 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d7d1ff65-f2dc-4196-954f-af1ce00e7800-tmp\") pod \"klusterlet-addon-workmgr-7fc49968d7-hc2br\" (UID: \"d7d1ff65-f2dc-4196-954f-af1ce00e7800\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7fc49968d7-hc2br" Apr 23 13:32:58.484269 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:58.484275 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/16f1aa89-3c8a-4d86-9b12-5db500c2ef23-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-554c84b6f5-sgbxr\" (UID: \"16f1aa89-3c8a-4d86-9b12-5db500c2ef23\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-554c84b6f5-sgbxr" Apr 23 13:32:58.484516 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:58.484294 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgftl\" (UniqueName: \"kubernetes.io/projected/d7d1ff65-f2dc-4196-954f-af1ce00e7800-kube-api-access-rgftl\") pod 
\"klusterlet-addon-workmgr-7fc49968d7-hc2br\" (UID: \"d7d1ff65-f2dc-4196-954f-af1ce00e7800\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7fc49968d7-hc2br" Apr 23 13:32:58.484516 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:58.484502 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/d7d1ff65-f2dc-4196-954f-af1ce00e7800-klusterlet-config\") pod \"klusterlet-addon-workmgr-7fc49968d7-hc2br\" (UID: \"d7d1ff65-f2dc-4196-954f-af1ce00e7800\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7fc49968d7-hc2br" Apr 23 13:32:58.484607 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:58.484547 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x26sf\" (UniqueName: \"kubernetes.io/projected/16f1aa89-3c8a-4d86-9b12-5db500c2ef23-kube-api-access-x26sf\") pod \"managed-serviceaccount-addon-agent-554c84b6f5-sgbxr\" (UID: \"16f1aa89-3c8a-4d86-9b12-5db500c2ef23\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-554c84b6f5-sgbxr" Apr 23 13:32:58.486675 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:58.486650 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/16f1aa89-3c8a-4d86-9b12-5db500c2ef23-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-554c84b6f5-sgbxr\" (UID: \"16f1aa89-3c8a-4d86-9b12-5db500c2ef23\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-554c84b6f5-sgbxr" Apr 23 13:32:58.493049 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:58.493025 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x26sf\" (UniqueName: \"kubernetes.io/projected/16f1aa89-3c8a-4d86-9b12-5db500c2ef23-kube-api-access-x26sf\") pod \"managed-serviceaccount-addon-agent-554c84b6f5-sgbxr\" (UID: 
\"16f1aa89-3c8a-4d86-9b12-5db500c2ef23\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-554c84b6f5-sgbxr" Apr 23 13:32:58.585448 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:58.585396 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d7d1ff65-f2dc-4196-954f-af1ce00e7800-tmp\") pod \"klusterlet-addon-workmgr-7fc49968d7-hc2br\" (UID: \"d7d1ff65-f2dc-4196-954f-af1ce00e7800\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7fc49968d7-hc2br" Apr 23 13:32:58.585448 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:58.585451 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rgftl\" (UniqueName: \"kubernetes.io/projected/d7d1ff65-f2dc-4196-954f-af1ce00e7800-kube-api-access-rgftl\") pod \"klusterlet-addon-workmgr-7fc49968d7-hc2br\" (UID: \"d7d1ff65-f2dc-4196-954f-af1ce00e7800\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7fc49968d7-hc2br" Apr 23 13:32:58.585627 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:58.585514 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/d7d1ff65-f2dc-4196-954f-af1ce00e7800-klusterlet-config\") pod \"klusterlet-addon-workmgr-7fc49968d7-hc2br\" (UID: \"d7d1ff65-f2dc-4196-954f-af1ce00e7800\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7fc49968d7-hc2br" Apr 23 13:32:58.585862 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:58.585838 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d7d1ff65-f2dc-4196-954f-af1ce00e7800-tmp\") pod \"klusterlet-addon-workmgr-7fc49968d7-hc2br\" (UID: \"d7d1ff65-f2dc-4196-954f-af1ce00e7800\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7fc49968d7-hc2br" Apr 23 13:32:58.587875 ip-10-0-134-22 kubenswrapper[2576]: 
I0423 13:32:58.587853 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/d7d1ff65-f2dc-4196-954f-af1ce00e7800-klusterlet-config\") pod \"klusterlet-addon-workmgr-7fc49968d7-hc2br\" (UID: \"d7d1ff65-f2dc-4196-954f-af1ce00e7800\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7fc49968d7-hc2br" Apr 23 13:32:58.593956 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:58.593904 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rgftl\" (UniqueName: \"kubernetes.io/projected/d7d1ff65-f2dc-4196-954f-af1ce00e7800-kube-api-access-rgftl\") pod \"klusterlet-addon-workmgr-7fc49968d7-hc2br\" (UID: \"d7d1ff65-f2dc-4196-954f-af1ce00e7800\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7fc49968d7-hc2br" Apr 23 13:32:58.667796 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:58.667763 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-554c84b6f5-sgbxr" Apr 23 13:32:58.712792 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:58.712756 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7fc49968d7-hc2br" Apr 23 13:32:58.787008 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:58.786976 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-554c84b6f5-sgbxr"] Apr 23 13:32:58.790714 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:32:58.790687 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod16f1aa89_3c8a_4d86_9b12_5db500c2ef23.slice/crio-91bc09ade4c2e90c4ec51583de107f86a4be2093f05ec0cbcaffc65a1f557b56 WatchSource:0}: Error finding container 91bc09ade4c2e90c4ec51583de107f86a4be2093f05ec0cbcaffc65a1f557b56: Status 404 returned error can't find the container with id 91bc09ade4c2e90c4ec51583de107f86a4be2093f05ec0cbcaffc65a1f557b56 Apr 23 13:32:58.831222 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:58.831195 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-7fc49968d7-hc2br"] Apr 23 13:32:58.834079 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:32:58.834043 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd7d1ff65_f2dc_4196_954f_af1ce00e7800.slice/crio-3838d9b26f09a0a98d7e9ec1364637dd6f06a3acc91b20453c2de540fb0f74f5 WatchSource:0}: Error finding container 3838d9b26f09a0a98d7e9ec1364637dd6f06a3acc91b20453c2de540fb0f74f5: Status 404 returned error can't find the container with id 3838d9b26f09a0a98d7e9ec1364637dd6f06a3acc91b20453c2de540fb0f74f5 Apr 23 13:32:59.414485 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:59.414443 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-554c84b6f5-sgbxr" 
event={"ID":"16f1aa89-3c8a-4d86-9b12-5db500c2ef23","Type":"ContainerStarted","Data":"91bc09ade4c2e90c4ec51583de107f86a4be2093f05ec0cbcaffc65a1f557b56"} Apr 23 13:32:59.415501 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:32:59.415474 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7fc49968d7-hc2br" event={"ID":"d7d1ff65-f2dc-4196-954f-af1ce00e7800","Type":"ContainerStarted","Data":"3838d9b26f09a0a98d7e9ec1364637dd6f06a3acc91b20453c2de540fb0f74f5"} Apr 23 13:33:02.422352 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:33:02.422316 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-554c84b6f5-sgbxr" event={"ID":"16f1aa89-3c8a-4d86-9b12-5db500c2ef23","Type":"ContainerStarted","Data":"42c27000c38057ef5a3ef2ec8536cf352e11558bdc39fb0d3eb1712990df5182"} Apr 23 13:33:02.437077 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:33:02.437033 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-554c84b6f5-sgbxr" podStartSLOduration=1.5876330269999999 podStartE2EDuration="4.437020569s" podCreationTimestamp="2026-04-23 13:32:58 +0000 UTC" firstStartedPulling="2026-04-23 13:32:58.792956473 +0000 UTC m=+70.251415734" lastFinishedPulling="2026-04-23 13:33:01.642344009 +0000 UTC m=+73.100803276" observedRunningTime="2026-04-23 13:33:02.436069353 +0000 UTC m=+73.894528629" watchObservedRunningTime="2026-04-23 13:33:02.437020569 +0000 UTC m=+73.895479842" Apr 23 13:33:11.441533 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:33:11.441487 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7fc49968d7-hc2br" event={"ID":"d7d1ff65-f2dc-4196-954f-af1ce00e7800","Type":"ContainerStarted","Data":"ec72102bd9877c85c3c2f2d334be0d7abe39c7f19ae07e06988ae09e34acfc4e"} Apr 23 13:33:11.442012 
ip-10-0-134-22 kubenswrapper[2576]: I0423 13:33:11.441701 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7fc49968d7-hc2br" Apr 23 13:33:11.443279 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:33:11.443258 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7fc49968d7-hc2br" Apr 23 13:33:11.457612 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:33:11.457574 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7fc49968d7-hc2br" podStartSLOduration=1.735317943 podStartE2EDuration="13.457563144s" podCreationTimestamp="2026-04-23 13:32:58 +0000 UTC" firstStartedPulling="2026-04-23 13:32:58.83584358 +0000 UTC m=+70.294302833" lastFinishedPulling="2026-04-23 13:33:10.558088752 +0000 UTC m=+82.016548034" observedRunningTime="2026-04-23 13:33:11.45698757 +0000 UTC m=+82.915446845" watchObservedRunningTime="2026-04-23 13:33:11.457563144 +0000 UTC m=+82.916022418" Apr 23 13:33:25.695404 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:33:25.695341 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/65f33a6a-b4d7-451f-a523-ddee419b31c8-registry-tls\") pod \"image-registry-5449456df7-4lgzh\" (UID: \"65f33a6a-b4d7-451f-a523-ddee419b31c8\") " pod="openshift-image-registry/image-registry-5449456df7-4lgzh" Apr 23 13:33:25.695807 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:33:25.695509 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 23 13:33:25.695807 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:33:25.695527 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5449456df7-4lgzh: secret 
"image-registry-tls" not found Apr 23 13:33:25.695807 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:33:25.695604 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/65f33a6a-b4d7-451f-a523-ddee419b31c8-registry-tls podName:65f33a6a-b4d7-451f-a523-ddee419b31c8 nodeName:}" failed. No retries permitted until 2026-04-23 13:34:29.695588521 +0000 UTC m=+161.154047791 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/65f33a6a-b4d7-451f-a523-ddee419b31c8-registry-tls") pod "image-registry-5449456df7-4lgzh" (UID: "65f33a6a-b4d7-451f-a523-ddee419b31c8") : secret "image-registry-tls" not found Apr 23 13:33:25.795906 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:33:25.795814 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/37a73ca9-a37a-469b-8043-50d6b6f5ae10-cert\") pod \"ingress-canary-x77wr\" (UID: \"37a73ca9-a37a-469b-8043-50d6b6f5ae10\") " pod="openshift-ingress-canary/ingress-canary-x77wr" Apr 23 13:33:25.796047 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:33:25.795960 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 13:33:25.796047 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:33:25.796033 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/37a73ca9-a37a-469b-8043-50d6b6f5ae10-cert podName:37a73ca9-a37a-469b-8043-50d6b6f5ae10 nodeName:}" failed. No retries permitted until 2026-04-23 13:34:29.796017172 +0000 UTC m=+161.254476424 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/37a73ca9-a37a-469b-8043-50d6b6f5ae10-cert") pod "ingress-canary-x77wr" (UID: "37a73ca9-a37a-469b-8043-50d6b6f5ae10") : secret "canary-serving-cert" not found
Apr 23 13:33:25.896164 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:33:25.896128 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/85ae1732-8725-4694-959f-e9e424548aca-metrics-tls\") pod \"dns-default-294bt\" (UID: \"85ae1732-8725-4694-959f-e9e424548aca\") " pod="openshift-dns/dns-default-294bt"
Apr 23 13:33:25.896304 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:33:25.896267 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 23 13:33:25.896343 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:33:25.896329 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/85ae1732-8725-4694-959f-e9e424548aca-metrics-tls podName:85ae1732-8725-4694-959f-e9e424548aca nodeName:}" failed. No retries permitted until 2026-04-23 13:34:29.896310841 +0000 UTC m=+161.354770093 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/85ae1732-8725-4694-959f-e9e424548aca-metrics-tls") pod "dns-default-294bt" (UID: "85ae1732-8725-4694-959f-e9e424548aca") : secret "dns-default-metrics-tls" not found
Apr 23 13:33:29.417959 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:33:29.417930 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-dggd8"
Apr 23 13:33:58.937769 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:33:58.937733 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-sfkld"]
Apr 23 13:33:58.941108 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:33:58.941091 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-sfkld"
Apr 23 13:33:58.944565 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:33:58.944540 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0dcb0e48-f774-49cd-8b04-58a5050a5ff2-metrics-certs\") pod \"network-metrics-daemon-564gb\" (UID: \"0dcb0e48-f774-49cd-8b04-58a5050a5ff2\") " pod="openshift-multus/network-metrics-daemon-564gb"
Apr 23 13:33:58.944700 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:33:58.944626 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 23 13:33:58.944700 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:33:58.944682 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0dcb0e48-f774-49cd-8b04-58a5050a5ff2-metrics-certs podName:0dcb0e48-f774-49cd-8b04-58a5050a5ff2 nodeName:}" failed. No retries permitted until 2026-04-23 13:36:00.944667788 +0000 UTC m=+252.403127040 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0dcb0e48-f774-49cd-8b04-58a5050a5ff2-metrics-certs") pod "network-metrics-daemon-564gb" (UID: "0dcb0e48-f774-49cd-8b04-58a5050a5ff2") : secret "metrics-daemon-secret" not found
Apr 23 13:33:58.946577 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:33:58.946555 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\""
Apr 23 13:33:58.946690 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:33:58.946636 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-w252c\""
Apr 23 13:33:58.948035 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:33:58.948011 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\""
Apr 23 13:33:58.962164 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:33:58.962135 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-sfkld"]
Apr 23 13:33:59.045132 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:33:59.045088 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24lkl\" (UniqueName: \"kubernetes.io/projected/16dee9ba-f0e8-474c-8023-1231f9070d98-kube-api-access-24lkl\") pod \"migrator-74bb7799d9-sfkld\" (UID: \"16dee9ba-f0e8-474c-8023-1231f9070d98\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-sfkld"
Apr 23 13:33:59.145655 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:33:59.145623 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-24lkl\" (UniqueName: \"kubernetes.io/projected/16dee9ba-f0e8-474c-8023-1231f9070d98-kube-api-access-24lkl\") pod \"migrator-74bb7799d9-sfkld\" (UID: \"16dee9ba-f0e8-474c-8023-1231f9070d98\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-sfkld"
Apr 23 13:33:59.156781 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:33:59.156745 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-24lkl\" (UniqueName: \"kubernetes.io/projected/16dee9ba-f0e8-474c-8023-1231f9070d98-kube-api-access-24lkl\") pod \"migrator-74bb7799d9-sfkld\" (UID: \"16dee9ba-f0e8-474c-8023-1231f9070d98\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-sfkld"
Apr 23 13:33:59.249446 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:33:59.249346 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-sfkld"
Apr 23 13:33:59.363631 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:33:59.363598 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-sfkld"]
Apr 23 13:33:59.367103 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:33:59.367069 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod16dee9ba_f0e8_474c_8023_1231f9070d98.slice/crio-c0def5b52c50e84df67c63d0196ea90581c8f2d4f552b44102dd5255e4d92bc6 WatchSource:0}: Error finding container c0def5b52c50e84df67c63d0196ea90581c8f2d4f552b44102dd5255e4d92bc6: Status 404 returned error can't find the container with id c0def5b52c50e84df67c63d0196ea90581c8f2d4f552b44102dd5255e4d92bc6
Apr 23 13:33:59.536088 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:33:59.536005 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-sfkld" event={"ID":"16dee9ba-f0e8-474c-8023-1231f9070d98","Type":"ContainerStarted","Data":"c0def5b52c50e84df67c63d0196ea90581c8f2d4f552b44102dd5255e4d92bc6"}
Apr 23 13:34:00.478531 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:00.478512 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-cj9dc_8ddfabc2-1040-4841-9473-ed5ba1c0c775/dns-node-resolver/0.log"
Apr 23 13:34:00.542171 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:00.542136 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-sfkld" event={"ID":"16dee9ba-f0e8-474c-8023-1231f9070d98","Type":"ContainerStarted","Data":"18578b1e887b3ad64f7d92727ee5260199e1beb1841341973520661ed1182655"}
Apr 23 13:34:01.484713 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:01.484686 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-jfnjq_098e4208-e230-428a-af72-f1aa64c09ce0/node-ca/0.log"
Apr 23 13:34:01.546930 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:01.546897 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-sfkld" event={"ID":"16dee9ba-f0e8-474c-8023-1231f9070d98","Type":"ContainerStarted","Data":"361515dbe9e853bf6639390349456a0d6e8d35ffca11951a463a25049d3a4428"}
Apr 23 13:34:01.562709 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:01.562664 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-sfkld" podStartSLOduration=2.463092752 podStartE2EDuration="3.562650096s" podCreationTimestamp="2026-04-23 13:33:58 +0000 UTC" firstStartedPulling="2026-04-23 13:33:59.368796928 +0000 UTC m=+130.827256184" lastFinishedPulling="2026-04-23 13:34:00.468354276 +0000 UTC m=+131.926813528" observedRunningTime="2026-04-23 13:34:01.562402828 +0000 UTC m=+133.020862101" watchObservedRunningTime="2026-04-23 13:34:01.562650096 +0000 UTC m=+133.021109370"
Apr 23 13:34:24.857121 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:34:24.857065 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-image-registry/image-registry-5449456df7-4lgzh" podUID="65f33a6a-b4d7-451f-a523-ddee419b31c8"
Apr 23 13:34:24.875247 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:34:24.875203 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-x77wr" podUID="37a73ca9-a37a-469b-8043-50d6b6f5ae10"
Apr 23 13:34:24.971716 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:34:24.971684 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-294bt" podUID="85ae1732-8725-4694-959f-e9e424548aca"
Apr 23 13:34:25.609763 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:25.609731 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5449456df7-4lgzh"
Apr 23 13:34:26.176845 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:34:26.176806 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-564gb" podUID="0dcb0e48-f774-49cd-8b04-58a5050a5ff2"
Apr 23 13:34:26.516929 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:26.516858 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-rb4nw"]
Apr 23 13:34:26.520099 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:26.520082 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-rb4nw"
Apr 23 13:34:26.527235 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:26.527218 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 23 13:34:26.528683 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:26.528664 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 23 13:34:26.529364 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:26.529351 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 23 13:34:26.529940 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:26.529927 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 23 13:34:26.530001 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:26.529970 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-9khvh\""
Apr 23 13:34:26.543584 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:26.543555 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-rb4nw"]
Apr 23 13:34:26.545628 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:26.545605 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/abf891cd-631a-4743-8e63-cfa25d73e6c1-crio-socket\") pod \"insights-runtime-extractor-rb4nw\" (UID: \"abf891cd-631a-4743-8e63-cfa25d73e6c1\") " pod="openshift-insights/insights-runtime-extractor-rb4nw"
Apr 23 13:34:26.545735 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:26.545646 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/abf891cd-631a-4743-8e63-cfa25d73e6c1-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-rb4nw\" (UID: \"abf891cd-631a-4743-8e63-cfa25d73e6c1\") " pod="openshift-insights/insights-runtime-extractor-rb4nw"
Apr 23 13:34:26.545735 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:26.545666 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fkjwf\" (UniqueName: \"kubernetes.io/projected/abf891cd-631a-4743-8e63-cfa25d73e6c1-kube-api-access-fkjwf\") pod \"insights-runtime-extractor-rb4nw\" (UID: \"abf891cd-631a-4743-8e63-cfa25d73e6c1\") " pod="openshift-insights/insights-runtime-extractor-rb4nw"
Apr 23 13:34:26.545735 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:26.545685 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/abf891cd-631a-4743-8e63-cfa25d73e6c1-data-volume\") pod \"insights-runtime-extractor-rb4nw\" (UID: \"abf891cd-631a-4743-8e63-cfa25d73e6c1\") " pod="openshift-insights/insights-runtime-extractor-rb4nw"
Apr 23 13:34:26.545850 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:26.545774 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/abf891cd-631a-4743-8e63-cfa25d73e6c1-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-rb4nw\" (UID: \"abf891cd-631a-4743-8e63-cfa25d73e6c1\") " pod="openshift-insights/insights-runtime-extractor-rb4nw"
Apr 23 13:34:26.646605 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:26.646571 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/abf891cd-631a-4743-8e63-cfa25d73e6c1-data-volume\") pod \"insights-runtime-extractor-rb4nw\" (UID: \"abf891cd-631a-4743-8e63-cfa25d73e6c1\") " pod="openshift-insights/insights-runtime-extractor-rb4nw"
Apr 23 13:34:26.646605 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:26.646618 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/abf891cd-631a-4743-8e63-cfa25d73e6c1-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-rb4nw\" (UID: \"abf891cd-631a-4743-8e63-cfa25d73e6c1\") " pod="openshift-insights/insights-runtime-extractor-rb4nw"
Apr 23 13:34:26.646853 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:26.646671 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/abf891cd-631a-4743-8e63-cfa25d73e6c1-crio-socket\") pod \"insights-runtime-extractor-rb4nw\" (UID: \"abf891cd-631a-4743-8e63-cfa25d73e6c1\") " pod="openshift-insights/insights-runtime-extractor-rb4nw"
Apr 23 13:34:26.646853 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:26.646697 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/abf891cd-631a-4743-8e63-cfa25d73e6c1-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-rb4nw\" (UID: \"abf891cd-631a-4743-8e63-cfa25d73e6c1\") " pod="openshift-insights/insights-runtime-extractor-rb4nw"
Apr 23 13:34:26.646853 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:26.646715 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fkjwf\" (UniqueName: \"kubernetes.io/projected/abf891cd-631a-4743-8e63-cfa25d73e6c1-kube-api-access-fkjwf\") pod \"insights-runtime-extractor-rb4nw\" (UID: \"abf891cd-631a-4743-8e63-cfa25d73e6c1\") " pod="openshift-insights/insights-runtime-extractor-rb4nw"
Apr 23 13:34:26.646853 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:26.646782 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/abf891cd-631a-4743-8e63-cfa25d73e6c1-crio-socket\") pod \"insights-runtime-extractor-rb4nw\" (UID: \"abf891cd-631a-4743-8e63-cfa25d73e6c1\") " pod="openshift-insights/insights-runtime-extractor-rb4nw"
Apr 23 13:34:26.647434 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:26.647391 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/abf891cd-631a-4743-8e63-cfa25d73e6c1-data-volume\") pod \"insights-runtime-extractor-rb4nw\" (UID: \"abf891cd-631a-4743-8e63-cfa25d73e6c1\") " pod="openshift-insights/insights-runtime-extractor-rb4nw"
Apr 23 13:34:26.647740 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:26.647719 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/abf891cd-631a-4743-8e63-cfa25d73e6c1-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-rb4nw\" (UID: \"abf891cd-631a-4743-8e63-cfa25d73e6c1\") " pod="openshift-insights/insights-runtime-extractor-rb4nw"
Apr 23 13:34:26.649139 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:26.649115 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/abf891cd-631a-4743-8e63-cfa25d73e6c1-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-rb4nw\" (UID: \"abf891cd-631a-4743-8e63-cfa25d73e6c1\") " pod="openshift-insights/insights-runtime-extractor-rb4nw"
Apr 23 13:34:26.658464 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:26.658435 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fkjwf\" (UniqueName: \"kubernetes.io/projected/abf891cd-631a-4743-8e63-cfa25d73e6c1-kube-api-access-fkjwf\") pod \"insights-runtime-extractor-rb4nw\" (UID: \"abf891cd-631a-4743-8e63-cfa25d73e6c1\") " pod="openshift-insights/insights-runtime-extractor-rb4nw"
Apr 23 13:34:26.828863 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:26.828778 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-rb4nw"
Apr 23 13:34:26.943893 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:26.943862 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-rb4nw"]
Apr 23 13:34:26.946903 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:34:26.946873 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podabf891cd_631a_4743_8e63_cfa25d73e6c1.slice/crio-f684375589cf6eb7342efe8866113a1c4dac449cbe39a8bb7e496dbf4f695634 WatchSource:0}: Error finding container f684375589cf6eb7342efe8866113a1c4dac449cbe39a8bb7e496dbf4f695634: Status 404 returned error can't find the container with id f684375589cf6eb7342efe8866113a1c4dac449cbe39a8bb7e496dbf4f695634
Apr 23 13:34:27.620123 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:27.620082 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-rb4nw" event={"ID":"abf891cd-631a-4743-8e63-cfa25d73e6c1","Type":"ContainerStarted","Data":"6fb9179c67a3b8c97e4cf1b6070e15f032d847d47800b2f0af6af1a5c8a165ac"}
Apr 23 13:34:27.620484 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:27.620127 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-rb4nw" event={"ID":"abf891cd-631a-4743-8e63-cfa25d73e6c1","Type":"ContainerStarted","Data":"a370d1ff279d86be22ab873a8ac2591b537213a585196b1c2a09d49195b41680"}
Apr 23 13:34:27.620484 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:27.620144 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-rb4nw" event={"ID":"abf891cd-631a-4743-8e63-cfa25d73e6c1","Type":"ContainerStarted","Data":"f684375589cf6eb7342efe8866113a1c4dac449cbe39a8bb7e496dbf4f695634"}
Apr 23 13:34:29.625657 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:29.625612 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-rb4nw" event={"ID":"abf891cd-631a-4743-8e63-cfa25d73e6c1","Type":"ContainerStarted","Data":"9b04cf53d74e936a465faf3aba0449a28d4331a519a8992cd1b6e7fa0cf06cc9"}
Apr 23 13:34:29.644352 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:29.644302 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-rb4nw" podStartSLOduration=1.660078977 podStartE2EDuration="3.64428836s" podCreationTimestamp="2026-04-23 13:34:26 +0000 UTC" firstStartedPulling="2026-04-23 13:34:26.994585127 +0000 UTC m=+158.453044379" lastFinishedPulling="2026-04-23 13:34:28.978794506 +0000 UTC m=+160.437253762" observedRunningTime="2026-04-23 13:34:29.643105366 +0000 UTC m=+161.101564640" watchObservedRunningTime="2026-04-23 13:34:29.64428836 +0000 UTC m=+161.102747636"
Apr 23 13:34:29.772264 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:29.772232 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/65f33a6a-b4d7-451f-a523-ddee419b31c8-registry-tls\") pod \"image-registry-5449456df7-4lgzh\" (UID: \"65f33a6a-b4d7-451f-a523-ddee419b31c8\") " pod="openshift-image-registry/image-registry-5449456df7-4lgzh"
Apr 23 13:34:29.774685 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:29.774659 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/65f33a6a-b4d7-451f-a523-ddee419b31c8-registry-tls\") pod \"image-registry-5449456df7-4lgzh\" (UID: \"65f33a6a-b4d7-451f-a523-ddee419b31c8\") " pod="openshift-image-registry/image-registry-5449456df7-4lgzh"
Apr 23 13:34:29.813029 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:29.813002 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-nw8fz\""
Apr 23 13:34:29.820950 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:29.820932 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5449456df7-4lgzh"
Apr 23 13:34:29.873019 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:29.872985 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/37a73ca9-a37a-469b-8043-50d6b6f5ae10-cert\") pod \"ingress-canary-x77wr\" (UID: \"37a73ca9-a37a-469b-8043-50d6b6f5ae10\") " pod="openshift-ingress-canary/ingress-canary-x77wr"
Apr 23 13:34:29.876241 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:29.876150 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/37a73ca9-a37a-469b-8043-50d6b6f5ae10-cert\") pod \"ingress-canary-x77wr\" (UID: \"37a73ca9-a37a-469b-8043-50d6b6f5ae10\") " pod="openshift-ingress-canary/ingress-canary-x77wr"
Apr 23 13:34:29.942464 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:29.942441 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5449456df7-4lgzh"]
Apr 23 13:34:29.944796 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:34:29.944767 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod65f33a6a_b4d7_451f_a523_ddee419b31c8.slice/crio-17ca5c6c30bc8bfc223c93198de1356943fc78b4dbf3067621fa3a1fd8770315 WatchSource:0}: Error finding container 17ca5c6c30bc8bfc223c93198de1356943fc78b4dbf3067621fa3a1fd8770315: Status 404 returned error can't find the container with id 17ca5c6c30bc8bfc223c93198de1356943fc78b4dbf3067621fa3a1fd8770315
Apr 23 13:34:29.973963 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:29.973939 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/85ae1732-8725-4694-959f-e9e424548aca-metrics-tls\") pod \"dns-default-294bt\" (UID: \"85ae1732-8725-4694-959f-e9e424548aca\") " pod="openshift-dns/dns-default-294bt"
Apr 23 13:34:29.975963 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:29.975939 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/85ae1732-8725-4694-959f-e9e424548aca-metrics-tls\") pod \"dns-default-294bt\" (UID: \"85ae1732-8725-4694-959f-e9e424548aca\") " pod="openshift-dns/dns-default-294bt"
Apr 23 13:34:30.628997 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:30.628956 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5449456df7-4lgzh" event={"ID":"65f33a6a-b4d7-451f-a523-ddee419b31c8","Type":"ContainerStarted","Data":"bf4634eebbd57fc70cc9e1fb8e5e0613bff9d582dbaa5794103c3c4a31881958"}
Apr 23 13:34:30.628997 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:30.629001 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5449456df7-4lgzh" event={"ID":"65f33a6a-b4d7-451f-a523-ddee419b31c8","Type":"ContainerStarted","Data":"17ca5c6c30bc8bfc223c93198de1356943fc78b4dbf3067621fa3a1fd8770315"}
Apr 23 13:34:30.654229 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:30.654171 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-5449456df7-4lgzh" podStartSLOduration=161.654152932 podStartE2EDuration="2m41.654152932s" podCreationTimestamp="2026-04-23 13:31:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 13:34:30.652219869 +0000 UTC m=+162.110679145" watchObservedRunningTime="2026-04-23 13:34:30.654152932 +0000 UTC m=+162.112612210"
Apr 23 13:34:31.632021 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:31.631963 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-5449456df7-4lgzh"
Apr 23 13:34:36.073163 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:36.073083 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-dgg7g"]
Apr 23 13:34:36.078692 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:36.078676 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-dgg7g"
Apr 23 13:34:36.081592 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:36.081536 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 23 13:34:36.082155 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:36.082123 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\""
Apr 23 13:34:36.082849 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:36.082333 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-8dqsw\""
Apr 23 13:34:36.082849 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:36.082546 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\""
Apr 23 13:34:36.082849 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:36.082820 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 23 13:34:36.083337 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:36.083278 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 23 13:34:36.089926 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:36.089637 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-dgg7g"]
Apr 23 13:34:36.116717 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:36.116692 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-jbdzg"]
Apr 23 13:34:36.119585 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:36.119553 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/bfaad655-80b2-4abf-9055-f34b4dc51fc8-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-dgg7g\" (UID: \"bfaad655-80b2-4abf-9055-f34b4dc51fc8\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-dgg7g"
Apr 23 13:34:36.119687 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:36.119604 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/bfaad655-80b2-4abf-9055-f34b4dc51fc8-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-dgg7g\" (UID: \"bfaad655-80b2-4abf-9055-f34b4dc51fc8\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-dgg7g"
Apr 23 13:34:36.119687 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:36.119676 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bfaad655-80b2-4abf-9055-f34b4dc51fc8-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-dgg7g\" (UID: \"bfaad655-80b2-4abf-9055-f34b4dc51fc8\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-dgg7g"
Apr 23 13:34:36.119756 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:36.119704 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r57wv\" (UniqueName: \"kubernetes.io/projected/bfaad655-80b2-4abf-9055-f34b4dc51fc8-kube-api-access-r57wv\") pod \"openshift-state-metrics-9d44df66c-dgg7g\" (UID: \"bfaad655-80b2-4abf-9055-f34b4dc51fc8\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-dgg7g"
Apr 23 13:34:36.120875 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:36.120857 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-jbdzg"
Apr 23 13:34:36.123288 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:36.123266 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\""
Apr 23 13:34:36.123288 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:36.123282 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\""
Apr 23 13:34:36.123491 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:36.123387 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-hljgz\""
Apr 23 13:34:36.123700 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:36.123685 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\""
Apr 23 13:34:36.136763 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:36.136741 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-mfnwd"]
Apr 23 13:34:36.140322 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:36.140304 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-jbdzg"]
Apr 23 13:34:36.140428 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:36.140404 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-mfnwd"
Apr 23 13:34:36.142800 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:36.142783 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 23 13:34:36.143106 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:36.143089 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-24rls\""
Apr 23 13:34:36.143195 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:36.143171 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 23 13:34:36.143195 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:36.143183 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 23 13:34:36.220569 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:36.220530 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/a08bb45f-4b1e-4c3f-98f3-d171b1c3212c-node-exporter-tls\") pod \"node-exporter-mfnwd\" (UID: \"a08bb45f-4b1e-4c3f-98f3-d171b1c3212c\") " pod="openshift-monitoring/node-exporter-mfnwd"
Apr 23 13:34:36.220569 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:36.220571 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/a08bb45f-4b1e-4c3f-98f3-d171b1c3212c-root\") pod \"node-exporter-mfnwd\" (UID: \"a08bb45f-4b1e-4c3f-98f3-d171b1c3212c\") " pod="openshift-monitoring/node-exporter-mfnwd"
Apr 23 13:34:36.220781 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:36.220601 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7v6c\" (UniqueName: \"kubernetes.io/projected/e382350e-9e57-4746-b19b-b8c4c615a7b2-kube-api-access-z7v6c\") pod \"kube-state-metrics-69db897b98-jbdzg\" (UID: \"e382350e-9e57-4746-b19b-b8c4c615a7b2\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jbdzg"
Apr 23 13:34:36.220781 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:36.220649 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a08bb45f-4b1e-4c3f-98f3-d171b1c3212c-sys\") pod \"node-exporter-mfnwd\" (UID: \"a08bb45f-4b1e-4c3f-98f3-d171b1c3212c\") " pod="openshift-monitoring/node-exporter-mfnwd"
Apr 23 13:34:36.220781 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:36.220672 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a08bb45f-4b1e-4c3f-98f3-d171b1c3212c-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-mfnwd\" (UID: \"a08bb45f-4b1e-4c3f-98f3-d171b1c3212c\") " pod="openshift-monitoring/node-exporter-mfnwd"
Apr 23 13:34:36.220781 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:36.220755 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/a08bb45f-4b1e-4c3f-98f3-d171b1c3212c-node-exporter-wtmp\") pod \"node-exporter-mfnwd\" (UID: \"a08bb45f-4b1e-4c3f-98f3-d171b1c3212c\") " pod="openshift-monitoring/node-exporter-mfnwd"
Apr 23 13:34:36.220908 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:36.220792 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/a08bb45f-4b1e-4c3f-98f3-d171b1c3212c-node-exporter-accelerators-collector-config\") pod \"node-exporter-mfnwd\" (UID: \"a08bb45f-4b1e-4c3f-98f3-d171b1c3212c\") " pod="openshift-monitoring/node-exporter-mfnwd"
Apr 23 13:34:36.220908 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:36.220817 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/e382350e-9e57-4746-b19b-b8c4c615a7b2-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-jbdzg\" (UID: \"e382350e-9e57-4746-b19b-b8c4c615a7b2\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jbdzg"
Apr 23 13:34:36.220908 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:36.220854 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/bfaad655-80b2-4abf-9055-f34b4dc51fc8-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-dgg7g\" (UID: \"bfaad655-80b2-4abf-9055-f34b4dc51fc8\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-dgg7g"
Apr 23 13:34:36.220908 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:36.220903 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/a08bb45f-4b1e-4c3f-98f3-d171b1c3212c-node-exporter-textfile\") pod \"node-exporter-mfnwd\" (UID: \"a08bb45f-4b1e-4c3f-98f3-d171b1c3212c\") " pod="openshift-monitoring/node-exporter-mfnwd"
Apr 23 13:34:36.221086 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:36.220922 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/e382350e-9e57-4746-b19b-b8c4c615a7b2-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-jbdzg\" (UID: \"e382350e-9e57-4746-b19b-b8c4c615a7b2\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jbdzg"
Apr 23 13:34:36.221086 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:36.220943 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/bfaad655-80b2-4abf-9055-f34b4dc51fc8-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-dgg7g\" (UID: \"bfaad655-80b2-4abf-9055-f34b4dc51fc8\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-dgg7g"
Apr 23 13:34:36.221086 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:36.220962 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e382350e-9e57-4746-b19b-b8c4c615a7b2-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-jbdzg\" (UID: \"e382350e-9e57-4746-b19b-b8c4c615a7b2\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jbdzg"
Apr 23 13:34:36.221086 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:36.221002 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bfaad655-80b2-4abf-9055-f34b4dc51fc8-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-dgg7g\" (UID: \"bfaad655-80b2-4abf-9055-f34b4dc51fc8\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-dgg7g"
Apr 23 13:34:36.221086 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:36.221026 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e382350e-9e57-4746-b19b-b8c4c615a7b2-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-jbdzg\" (UID: \"e382350e-9e57-4746-b19b-b8c4c615a7b2\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jbdzg"
Apr 23 13:34:36.221086 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:36.221045 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r57wv\" (UniqueName:
\"kubernetes.io/projected/bfaad655-80b2-4abf-9055-f34b4dc51fc8-kube-api-access-r57wv\") pod \"openshift-state-metrics-9d44df66c-dgg7g\" (UID: \"bfaad655-80b2-4abf-9055-f34b4dc51fc8\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-dgg7g" Apr 23 13:34:36.221086 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:36.221072 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/e382350e-9e57-4746-b19b-b8c4c615a7b2-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-jbdzg\" (UID: \"e382350e-9e57-4746-b19b-b8c4c615a7b2\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jbdzg" Apr 23 13:34:36.221340 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:36.221094 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a08bb45f-4b1e-4c3f-98f3-d171b1c3212c-metrics-client-ca\") pod \"node-exporter-mfnwd\" (UID: \"a08bb45f-4b1e-4c3f-98f3-d171b1c3212c\") " pod="openshift-monitoring/node-exporter-mfnwd" Apr 23 13:34:36.221340 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:36.221111 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7sxnv\" (UniqueName: \"kubernetes.io/projected/a08bb45f-4b1e-4c3f-98f3-d171b1c3212c-kube-api-access-7sxnv\") pod \"node-exporter-mfnwd\" (UID: \"a08bb45f-4b1e-4c3f-98f3-d171b1c3212c\") " pod="openshift-monitoring/node-exporter-mfnwd" Apr 23 13:34:36.221693 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:36.221675 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bfaad655-80b2-4abf-9055-f34b4dc51fc8-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-dgg7g\" (UID: 
\"bfaad655-80b2-4abf-9055-f34b4dc51fc8\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-dgg7g" Apr 23 13:34:36.223309 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:36.223291 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/bfaad655-80b2-4abf-9055-f34b4dc51fc8-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-dgg7g\" (UID: \"bfaad655-80b2-4abf-9055-f34b4dc51fc8\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-dgg7g" Apr 23 13:34:36.223384 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:36.223365 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/bfaad655-80b2-4abf-9055-f34b4dc51fc8-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-dgg7g\" (UID: \"bfaad655-80b2-4abf-9055-f34b4dc51fc8\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-dgg7g" Apr 23 13:34:36.228737 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:36.228716 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r57wv\" (UniqueName: \"kubernetes.io/projected/bfaad655-80b2-4abf-9055-f34b4dc51fc8-kube-api-access-r57wv\") pod \"openshift-state-metrics-9d44df66c-dgg7g\" (UID: \"bfaad655-80b2-4abf-9055-f34b4dc51fc8\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-dgg7g" Apr 23 13:34:36.322290 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:36.322256 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z7v6c\" (UniqueName: \"kubernetes.io/projected/e382350e-9e57-4746-b19b-b8c4c615a7b2-kube-api-access-z7v6c\") pod \"kube-state-metrics-69db897b98-jbdzg\" (UID: \"e382350e-9e57-4746-b19b-b8c4c615a7b2\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jbdzg" Apr 23 13:34:36.322290 ip-10-0-134-22 
kubenswrapper[2576]: I0423 13:34:36.322298 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a08bb45f-4b1e-4c3f-98f3-d171b1c3212c-sys\") pod \"node-exporter-mfnwd\" (UID: \"a08bb45f-4b1e-4c3f-98f3-d171b1c3212c\") " pod="openshift-monitoring/node-exporter-mfnwd" Apr 23 13:34:36.322550 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:36.322355 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a08bb45f-4b1e-4c3f-98f3-d171b1c3212c-sys\") pod \"node-exporter-mfnwd\" (UID: \"a08bb45f-4b1e-4c3f-98f3-d171b1c3212c\") " pod="openshift-monitoring/node-exporter-mfnwd" Apr 23 13:34:36.322550 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:36.322386 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a08bb45f-4b1e-4c3f-98f3-d171b1c3212c-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-mfnwd\" (UID: \"a08bb45f-4b1e-4c3f-98f3-d171b1c3212c\") " pod="openshift-monitoring/node-exporter-mfnwd" Apr 23 13:34:36.322550 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:36.322456 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/a08bb45f-4b1e-4c3f-98f3-d171b1c3212c-node-exporter-wtmp\") pod \"node-exporter-mfnwd\" (UID: \"a08bb45f-4b1e-4c3f-98f3-d171b1c3212c\") " pod="openshift-monitoring/node-exporter-mfnwd" Apr 23 13:34:36.322550 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:36.322485 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/a08bb45f-4b1e-4c3f-98f3-d171b1c3212c-node-exporter-accelerators-collector-config\") pod \"node-exporter-mfnwd\" (UID: \"a08bb45f-4b1e-4c3f-98f3-d171b1c3212c\") " 
pod="openshift-monitoring/node-exporter-mfnwd" Apr 23 13:34:36.322550 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:36.322510 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/e382350e-9e57-4746-b19b-b8c4c615a7b2-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-jbdzg\" (UID: \"e382350e-9e57-4746-b19b-b8c4c615a7b2\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jbdzg" Apr 23 13:34:36.322550 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:36.322533 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/a08bb45f-4b1e-4c3f-98f3-d171b1c3212c-node-exporter-textfile\") pod \"node-exporter-mfnwd\" (UID: \"a08bb45f-4b1e-4c3f-98f3-d171b1c3212c\") " pod="openshift-monitoring/node-exporter-mfnwd" Apr 23 13:34:36.322835 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:36.322560 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/e382350e-9e57-4746-b19b-b8c4c615a7b2-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-jbdzg\" (UID: \"e382350e-9e57-4746-b19b-b8c4c615a7b2\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jbdzg" Apr 23 13:34:36.322835 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:36.322598 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e382350e-9e57-4746-b19b-b8c4c615a7b2-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-jbdzg\" (UID: \"e382350e-9e57-4746-b19b-b8c4c615a7b2\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jbdzg" Apr 23 13:34:36.322835 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:36.322633 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e382350e-9e57-4746-b19b-b8c4c615a7b2-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-jbdzg\" (UID: \"e382350e-9e57-4746-b19b-b8c4c615a7b2\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jbdzg" Apr 23 13:34:36.322835 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:36.322666 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/e382350e-9e57-4746-b19b-b8c4c615a7b2-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-jbdzg\" (UID: \"e382350e-9e57-4746-b19b-b8c4c615a7b2\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jbdzg" Apr 23 13:34:36.322835 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:36.322692 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a08bb45f-4b1e-4c3f-98f3-d171b1c3212c-metrics-client-ca\") pod \"node-exporter-mfnwd\" (UID: \"a08bb45f-4b1e-4c3f-98f3-d171b1c3212c\") " pod="openshift-monitoring/node-exporter-mfnwd" Apr 23 13:34:36.322835 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:36.322694 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/a08bb45f-4b1e-4c3f-98f3-d171b1c3212c-node-exporter-wtmp\") pod \"node-exporter-mfnwd\" (UID: \"a08bb45f-4b1e-4c3f-98f3-d171b1c3212c\") " pod="openshift-monitoring/node-exporter-mfnwd" Apr 23 13:34:36.322835 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:36.322715 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7sxnv\" (UniqueName: \"kubernetes.io/projected/a08bb45f-4b1e-4c3f-98f3-d171b1c3212c-kube-api-access-7sxnv\") pod \"node-exporter-mfnwd\" (UID: \"a08bb45f-4b1e-4c3f-98f3-d171b1c3212c\") " 
pod="openshift-monitoring/node-exporter-mfnwd" Apr 23 13:34:36.322835 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:36.322764 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/a08bb45f-4b1e-4c3f-98f3-d171b1c3212c-node-exporter-tls\") pod \"node-exporter-mfnwd\" (UID: \"a08bb45f-4b1e-4c3f-98f3-d171b1c3212c\") " pod="openshift-monitoring/node-exporter-mfnwd" Apr 23 13:34:36.322835 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:36.322793 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/a08bb45f-4b1e-4c3f-98f3-d171b1c3212c-root\") pod \"node-exporter-mfnwd\" (UID: \"a08bb45f-4b1e-4c3f-98f3-d171b1c3212c\") " pod="openshift-monitoring/node-exporter-mfnwd" Apr 23 13:34:36.323283 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:36.322875 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/a08bb45f-4b1e-4c3f-98f3-d171b1c3212c-root\") pod \"node-exporter-mfnwd\" (UID: \"a08bb45f-4b1e-4c3f-98f3-d171b1c3212c\") " pod="openshift-monitoring/node-exporter-mfnwd" Apr 23 13:34:36.323283 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:36.323034 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/e382350e-9e57-4746-b19b-b8c4c615a7b2-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-jbdzg\" (UID: \"e382350e-9e57-4746-b19b-b8c4c615a7b2\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jbdzg" Apr 23 13:34:36.323283 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:36.323190 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/a08bb45f-4b1e-4c3f-98f3-d171b1c3212c-node-exporter-textfile\") pod \"node-exporter-mfnwd\" (UID: 
\"a08bb45f-4b1e-4c3f-98f3-d171b1c3212c\") " pod="openshift-monitoring/node-exporter-mfnwd" Apr 23 13:34:36.323696 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:36.323530 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/a08bb45f-4b1e-4c3f-98f3-d171b1c3212c-node-exporter-accelerators-collector-config\") pod \"node-exporter-mfnwd\" (UID: \"a08bb45f-4b1e-4c3f-98f3-d171b1c3212c\") " pod="openshift-monitoring/node-exporter-mfnwd" Apr 23 13:34:36.323696 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:36.323551 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a08bb45f-4b1e-4c3f-98f3-d171b1c3212c-metrics-client-ca\") pod \"node-exporter-mfnwd\" (UID: \"a08bb45f-4b1e-4c3f-98f3-d171b1c3212c\") " pod="openshift-monitoring/node-exporter-mfnwd" Apr 23 13:34:36.323879 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:36.323857 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e382350e-9e57-4746-b19b-b8c4c615a7b2-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-jbdzg\" (UID: \"e382350e-9e57-4746-b19b-b8c4c615a7b2\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jbdzg" Apr 23 13:34:36.324385 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:36.324357 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/e382350e-9e57-4746-b19b-b8c4c615a7b2-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-jbdzg\" (UID: \"e382350e-9e57-4746-b19b-b8c4c615a7b2\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jbdzg" Apr 23 13:34:36.325140 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:36.325115 2576 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a08bb45f-4b1e-4c3f-98f3-d171b1c3212c-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-mfnwd\" (UID: \"a08bb45f-4b1e-4c3f-98f3-d171b1c3212c\") " pod="openshift-monitoring/node-exporter-mfnwd" Apr 23 13:34:36.325232 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:36.325214 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/a08bb45f-4b1e-4c3f-98f3-d171b1c3212c-node-exporter-tls\") pod \"node-exporter-mfnwd\" (UID: \"a08bb45f-4b1e-4c3f-98f3-d171b1c3212c\") " pod="openshift-monitoring/node-exporter-mfnwd" Apr 23 13:34:36.325753 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:36.325736 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/e382350e-9e57-4746-b19b-b8c4c615a7b2-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-jbdzg\" (UID: \"e382350e-9e57-4746-b19b-b8c4c615a7b2\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jbdzg" Apr 23 13:34:36.325845 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:36.325826 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e382350e-9e57-4746-b19b-b8c4c615a7b2-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-jbdzg\" (UID: \"e382350e-9e57-4746-b19b-b8c4c615a7b2\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jbdzg" Apr 23 13:34:36.336905 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:36.336876 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7sxnv\" (UniqueName: \"kubernetes.io/projected/a08bb45f-4b1e-4c3f-98f3-d171b1c3212c-kube-api-access-7sxnv\") pod \"node-exporter-mfnwd\" (UID: \"a08bb45f-4b1e-4c3f-98f3-d171b1c3212c\") " 
pod="openshift-monitoring/node-exporter-mfnwd" Apr 23 13:34:36.337083 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:36.337066 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7v6c\" (UniqueName: \"kubernetes.io/projected/e382350e-9e57-4746-b19b-b8c4c615a7b2-kube-api-access-z7v6c\") pod \"kube-state-metrics-69db897b98-jbdzg\" (UID: \"e382350e-9e57-4746-b19b-b8c4c615a7b2\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jbdzg" Apr 23 13:34:36.393052 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:36.393021 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-dgg7g" Apr 23 13:34:36.430610 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:36.430578 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-jbdzg" Apr 23 13:34:36.449497 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:36.449464 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-mfnwd" Apr 23 13:34:36.459804 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:34:36.459762 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda08bb45f_4b1e_4c3f_98f3_d171b1c3212c.slice/crio-2e764ff62aa837fdf37b2117f6ac5c9a8f6ab629affbc68e503487894c24dc56 WatchSource:0}: Error finding container 2e764ff62aa837fdf37b2117f6ac5c9a8f6ab629affbc68e503487894c24dc56: Status 404 returned error can't find the container with id 2e764ff62aa837fdf37b2117f6ac5c9a8f6ab629affbc68e503487894c24dc56 Apr 23 13:34:36.522008 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:36.521979 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-dgg7g"] Apr 23 13:34:36.524866 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:34:36.524837 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbfaad655_80b2_4abf_9055_f34b4dc51fc8.slice/crio-0e8e361c31e387e5025f7a8786e4e2d6f2550571a33fa8f6a9d70e90abb0a589 WatchSource:0}: Error finding container 0e8e361c31e387e5025f7a8786e4e2d6f2550571a33fa8f6a9d70e90abb0a589: Status 404 returned error can't find the container with id 0e8e361c31e387e5025f7a8786e4e2d6f2550571a33fa8f6a9d70e90abb0a589 Apr 23 13:34:36.559499 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:36.559474 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-jbdzg"] Apr 23 13:34:36.567493 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:34:36.567464 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode382350e_9e57_4746_b19b_b8c4c615a7b2.slice/crio-74fcc357b7e4b8cfafe622c168c552876eab36c44c332e7ee44d9ee3e5cb6518 WatchSource:0}: Error finding container 
74fcc357b7e4b8cfafe622c168c552876eab36c44c332e7ee44d9ee3e5cb6518: Status 404 returned error can't find the container with id 74fcc357b7e4b8cfafe622c168c552876eab36c44c332e7ee44d9ee3e5cb6518 Apr 23 13:34:36.644772 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:36.644739 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-jbdzg" event={"ID":"e382350e-9e57-4746-b19b-b8c4c615a7b2","Type":"ContainerStarted","Data":"74fcc357b7e4b8cfafe622c168c552876eab36c44c332e7ee44d9ee3e5cb6518"} Apr 23 13:34:36.648271 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:36.648244 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-mfnwd" event={"ID":"a08bb45f-4b1e-4c3f-98f3-d171b1c3212c","Type":"ContainerStarted","Data":"2e764ff62aa837fdf37b2117f6ac5c9a8f6ab629affbc68e503487894c24dc56"} Apr 23 13:34:36.649930 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:36.649896 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-dgg7g" event={"ID":"bfaad655-80b2-4abf-9055-f34b4dc51fc8","Type":"ContainerStarted","Data":"c54ed80ca49aafec862949e1a6991a73ab4266371ec69398719d561dc34572d5"} Apr 23 13:34:36.650019 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:36.649930 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-dgg7g" event={"ID":"bfaad655-80b2-4abf-9055-f34b4dc51fc8","Type":"ContainerStarted","Data":"8fde04734a82496f14e6b27735e7c4c5b8a7740cce895876eaba2c14689f5381"} Apr 23 13:34:36.650019 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:36.649955 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-dgg7g" event={"ID":"bfaad655-80b2-4abf-9055-f34b4dc51fc8","Type":"ContainerStarted","Data":"0e8e361c31e387e5025f7a8786e4e2d6f2550571a33fa8f6a9d70e90abb0a589"} Apr 23 13:34:37.654297 ip-10-0-134-22 kubenswrapper[2576]: 
I0423 13:34:37.654257 2576 generic.go:358] "Generic (PLEG): container finished" podID="a08bb45f-4b1e-4c3f-98f3-d171b1c3212c" containerID="7327010fc6df3444cd22a75de8e4f9c80abbbd1a47add9e2826dd3acd6e0c8c3" exitCode=0 Apr 23 13:34:37.654745 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:37.654347 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-mfnwd" event={"ID":"a08bb45f-4b1e-4c3f-98f3-d171b1c3212c","Type":"ContainerDied","Data":"7327010fc6df3444cd22a75de8e4f9c80abbbd1a47add9e2826dd3acd6e0c8c3"} Apr 23 13:34:38.200051 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:38.199971 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-77c79b9cd5-w4mpm"] Apr 23 13:34:38.203566 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:38.203541 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-77c79b9cd5-w4mpm" Apr 23 13:34:38.208250 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:38.208105 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\"" Apr 23 13:34:38.208250 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:38.208136 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\"" Apr 23 13:34:38.208250 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:38.208193 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\"" Apr 23 13:34:38.208250 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:38.208232 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\"" Apr 23 13:34:38.209770 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:38.209747 2576 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-fpm7ibmeesjue\"" Apr 23 13:34:38.210013 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:38.209994 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-mzh2w\"" Apr 23 13:34:38.210218 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:38.210202 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\"" Apr 23 13:34:38.220754 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:38.220735 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-77c79b9cd5-w4mpm"] Apr 23 13:34:38.339655 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:38.339615 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/0c9fffe4-836e-4eb1-ab91-fe058d9c2765-secret-thanos-querier-tls\") pod \"thanos-querier-77c79b9cd5-w4mpm\" (UID: \"0c9fffe4-836e-4eb1-ab91-fe058d9c2765\") " pod="openshift-monitoring/thanos-querier-77c79b9cd5-w4mpm" Apr 23 13:34:38.339820 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:38.339665 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qsv2k\" (UniqueName: \"kubernetes.io/projected/0c9fffe4-836e-4eb1-ab91-fe058d9c2765-kube-api-access-qsv2k\") pod \"thanos-querier-77c79b9cd5-w4mpm\" (UID: \"0c9fffe4-836e-4eb1-ab91-fe058d9c2765\") " pod="openshift-monitoring/thanos-querier-77c79b9cd5-w4mpm" Apr 23 13:34:38.339820 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:38.339704 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0c9fffe4-836e-4eb1-ab91-fe058d9c2765-metrics-client-ca\") pod \"thanos-querier-77c79b9cd5-w4mpm\" (UID: 
\"0c9fffe4-836e-4eb1-ab91-fe058d9c2765\") " pod="openshift-monitoring/thanos-querier-77c79b9cd5-w4mpm" Apr 23 13:34:38.339820 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:38.339741 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/0c9fffe4-836e-4eb1-ab91-fe058d9c2765-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-77c79b9cd5-w4mpm\" (UID: \"0c9fffe4-836e-4eb1-ab91-fe058d9c2765\") " pod="openshift-monitoring/thanos-querier-77c79b9cd5-w4mpm" Apr 23 13:34:38.339820 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:38.339765 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/0c9fffe4-836e-4eb1-ab91-fe058d9c2765-secret-grpc-tls\") pod \"thanos-querier-77c79b9cd5-w4mpm\" (UID: \"0c9fffe4-836e-4eb1-ab91-fe058d9c2765\") " pod="openshift-monitoring/thanos-querier-77c79b9cd5-w4mpm" Apr 23 13:34:38.340036 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:38.339818 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/0c9fffe4-836e-4eb1-ab91-fe058d9c2765-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-77c79b9cd5-w4mpm\" (UID: \"0c9fffe4-836e-4eb1-ab91-fe058d9c2765\") " pod="openshift-monitoring/thanos-querier-77c79b9cd5-w4mpm" Apr 23 13:34:38.340036 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:38.339873 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/0c9fffe4-836e-4eb1-ab91-fe058d9c2765-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-77c79b9cd5-w4mpm\" (UID: \"0c9fffe4-836e-4eb1-ab91-fe058d9c2765\") " 
pod="openshift-monitoring/thanos-querier-77c79b9cd5-w4mpm" Apr 23 13:34:38.340036 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:38.339919 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/0c9fffe4-836e-4eb1-ab91-fe058d9c2765-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-77c79b9cd5-w4mpm\" (UID: \"0c9fffe4-836e-4eb1-ab91-fe058d9c2765\") " pod="openshift-monitoring/thanos-querier-77c79b9cd5-w4mpm" Apr 23 13:34:38.441272 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:38.441238 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/0c9fffe4-836e-4eb1-ab91-fe058d9c2765-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-77c79b9cd5-w4mpm\" (UID: \"0c9fffe4-836e-4eb1-ab91-fe058d9c2765\") " pod="openshift-monitoring/thanos-querier-77c79b9cd5-w4mpm" Apr 23 13:34:38.441458 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:38.441281 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/0c9fffe4-836e-4eb1-ab91-fe058d9c2765-secret-thanos-querier-tls\") pod \"thanos-querier-77c79b9cd5-w4mpm\" (UID: \"0c9fffe4-836e-4eb1-ab91-fe058d9c2765\") " pod="openshift-monitoring/thanos-querier-77c79b9cd5-w4mpm" Apr 23 13:34:38.441458 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:38.441298 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qsv2k\" (UniqueName: \"kubernetes.io/projected/0c9fffe4-836e-4eb1-ab91-fe058d9c2765-kube-api-access-qsv2k\") pod \"thanos-querier-77c79b9cd5-w4mpm\" (UID: \"0c9fffe4-836e-4eb1-ab91-fe058d9c2765\") " pod="openshift-monitoring/thanos-querier-77c79b9cd5-w4mpm" Apr 23 13:34:38.441458 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:38.441323 2576 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0c9fffe4-836e-4eb1-ab91-fe058d9c2765-metrics-client-ca\") pod \"thanos-querier-77c79b9cd5-w4mpm\" (UID: \"0c9fffe4-836e-4eb1-ab91-fe058d9c2765\") " pod="openshift-monitoring/thanos-querier-77c79b9cd5-w4mpm" Apr 23 13:34:38.441458 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:38.441362 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/0c9fffe4-836e-4eb1-ab91-fe058d9c2765-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-77c79b9cd5-w4mpm\" (UID: \"0c9fffe4-836e-4eb1-ab91-fe058d9c2765\") " pod="openshift-monitoring/thanos-querier-77c79b9cd5-w4mpm" Apr 23 13:34:38.441458 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:38.441387 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/0c9fffe4-836e-4eb1-ab91-fe058d9c2765-secret-grpc-tls\") pod \"thanos-querier-77c79b9cd5-w4mpm\" (UID: \"0c9fffe4-836e-4eb1-ab91-fe058d9c2765\") " pod="openshift-monitoring/thanos-querier-77c79b9cd5-w4mpm" Apr 23 13:34:38.441714 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:38.441461 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/0c9fffe4-836e-4eb1-ab91-fe058d9c2765-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-77c79b9cd5-w4mpm\" (UID: \"0c9fffe4-836e-4eb1-ab91-fe058d9c2765\") " pod="openshift-monitoring/thanos-querier-77c79b9cd5-w4mpm" Apr 23 13:34:38.441714 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:38.441549 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: 
\"kubernetes.io/secret/0c9fffe4-836e-4eb1-ab91-fe058d9c2765-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-77c79b9cd5-w4mpm\" (UID: \"0c9fffe4-836e-4eb1-ab91-fe058d9c2765\") " pod="openshift-monitoring/thanos-querier-77c79b9cd5-w4mpm" Apr 23 13:34:38.442240 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:38.442208 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0c9fffe4-836e-4eb1-ab91-fe058d9c2765-metrics-client-ca\") pod \"thanos-querier-77c79b9cd5-w4mpm\" (UID: \"0c9fffe4-836e-4eb1-ab91-fe058d9c2765\") " pod="openshift-monitoring/thanos-querier-77c79b9cd5-w4mpm" Apr 23 13:34:38.444053 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:38.444026 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/0c9fffe4-836e-4eb1-ab91-fe058d9c2765-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-77c79b9cd5-w4mpm\" (UID: \"0c9fffe4-836e-4eb1-ab91-fe058d9c2765\") " pod="openshift-monitoring/thanos-querier-77c79b9cd5-w4mpm" Apr 23 13:34:38.444228 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:38.444207 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/0c9fffe4-836e-4eb1-ab91-fe058d9c2765-secret-grpc-tls\") pod \"thanos-querier-77c79b9cd5-w4mpm\" (UID: \"0c9fffe4-836e-4eb1-ab91-fe058d9c2765\") " pod="openshift-monitoring/thanos-querier-77c79b9cd5-w4mpm" Apr 23 13:34:38.444315 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:38.444290 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/0c9fffe4-836e-4eb1-ab91-fe058d9c2765-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-77c79b9cd5-w4mpm\" (UID: \"0c9fffe4-836e-4eb1-ab91-fe058d9c2765\") " 
pod="openshift-monitoring/thanos-querier-77c79b9cd5-w4mpm" Apr 23 13:34:38.444374 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:38.444341 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/0c9fffe4-836e-4eb1-ab91-fe058d9c2765-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-77c79b9cd5-w4mpm\" (UID: \"0c9fffe4-836e-4eb1-ab91-fe058d9c2765\") " pod="openshift-monitoring/thanos-querier-77c79b9cd5-w4mpm" Apr 23 13:34:38.444513 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:38.444494 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/0c9fffe4-836e-4eb1-ab91-fe058d9c2765-secret-thanos-querier-tls\") pod \"thanos-querier-77c79b9cd5-w4mpm\" (UID: \"0c9fffe4-836e-4eb1-ab91-fe058d9c2765\") " pod="openshift-monitoring/thanos-querier-77c79b9cd5-w4mpm" Apr 23 13:34:38.444575 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:38.444523 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/0c9fffe4-836e-4eb1-ab91-fe058d9c2765-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-77c79b9cd5-w4mpm\" (UID: \"0c9fffe4-836e-4eb1-ab91-fe058d9c2765\") " pod="openshift-monitoring/thanos-querier-77c79b9cd5-w4mpm" Apr 23 13:34:38.449875 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:38.449857 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qsv2k\" (UniqueName: \"kubernetes.io/projected/0c9fffe4-836e-4eb1-ab91-fe058d9c2765-kube-api-access-qsv2k\") pod \"thanos-querier-77c79b9cd5-w4mpm\" (UID: \"0c9fffe4-836e-4eb1-ab91-fe058d9c2765\") " pod="openshift-monitoring/thanos-querier-77c79b9cd5-w4mpm" Apr 23 13:34:38.513133 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:38.513064 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-77c79b9cd5-w4mpm" Apr 23 13:34:38.635496 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:38.635463 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-77c79b9cd5-w4mpm"] Apr 23 13:34:38.638752 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:34:38.638723 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0c9fffe4_836e_4eb1_ab91_fe058d9c2765.slice/crio-a321750b9eaf038279af608cc5d3ba50aa96645e67b3f34e4d25779c9bb94532 WatchSource:0}: Error finding container a321750b9eaf038279af608cc5d3ba50aa96645e67b3f34e4d25779c9bb94532: Status 404 returned error can't find the container with id a321750b9eaf038279af608cc5d3ba50aa96645e67b3f34e4d25779c9bb94532 Apr 23 13:34:38.659283 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:38.659246 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-jbdzg" event={"ID":"e382350e-9e57-4746-b19b-b8c4c615a7b2","Type":"ContainerStarted","Data":"74833b12d4cd0177b66308a18e22d301806a6f977215e1fccbf76e41086778d0"} Apr 23 13:34:38.659695 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:38.659289 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-jbdzg" event={"ID":"e382350e-9e57-4746-b19b-b8c4c615a7b2","Type":"ContainerStarted","Data":"390beb840e08747020d2aa9903ae34aed3ce7577de2aa39f089415e0ceee518e"} Apr 23 13:34:38.659695 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:38.659305 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-jbdzg" event={"ID":"e382350e-9e57-4746-b19b-b8c4c615a7b2","Type":"ContainerStarted","Data":"ae8ad3bba803410edb5d7310375b62888f45925afe15e20fc37007628f145b76"} Apr 23 13:34:38.660486 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:38.660451 2576 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-77c79b9cd5-w4mpm" event={"ID":"0c9fffe4-836e-4eb1-ab91-fe058d9c2765","Type":"ContainerStarted","Data":"a321750b9eaf038279af608cc5d3ba50aa96645e67b3f34e4d25779c9bb94532"} Apr 23 13:34:38.662471 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:38.662445 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-mfnwd" event={"ID":"a08bb45f-4b1e-4c3f-98f3-d171b1c3212c","Type":"ContainerStarted","Data":"b1058a4df9fa9c46dfb8852ca697007bff2a48b92f38b9543819b256e9b5e085"} Apr 23 13:34:38.662578 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:38.662479 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-mfnwd" event={"ID":"a08bb45f-4b1e-4c3f-98f3-d171b1c3212c","Type":"ContainerStarted","Data":"8cf7f1139b75064c906ac9c8b7cb4d86eb11eb5c7bb7f99ccdc9acd717a819f7"} Apr 23 13:34:38.664333 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:38.664312 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-dgg7g" event={"ID":"bfaad655-80b2-4abf-9055-f34b4dc51fc8","Type":"ContainerStarted","Data":"b7cd42ec921ef3a295ca18ad16c82e9ed1e034cedd41ee70332d694e33f6540b"} Apr 23 13:34:38.677084 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:38.677042 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-jbdzg" podStartSLOduration=1.317923345 podStartE2EDuration="2.677030941s" podCreationTimestamp="2026-04-23 13:34:36 +0000 UTC" firstStartedPulling="2026-04-23 13:34:36.569255797 +0000 UTC m=+168.027715052" lastFinishedPulling="2026-04-23 13:34:37.92836338 +0000 UTC m=+169.386822648" observedRunningTime="2026-04-23 13:34:38.676199522 +0000 UTC m=+170.134658796" watchObservedRunningTime="2026-04-23 13:34:38.677030941 +0000 UTC m=+170.135490260" Apr 23 13:34:38.693677 ip-10-0-134-22 kubenswrapper[2576]: I0423 
13:34:38.693590 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-dgg7g" podStartSLOduration=1.411475204 podStartE2EDuration="2.693575885s" podCreationTimestamp="2026-04-23 13:34:36 +0000 UTC" firstStartedPulling="2026-04-23 13:34:36.647513043 +0000 UTC m=+168.105972296" lastFinishedPulling="2026-04-23 13:34:37.929613721 +0000 UTC m=+169.388072977" observedRunningTime="2026-04-23 13:34:38.693291806 +0000 UTC m=+170.151751082" watchObservedRunningTime="2026-04-23 13:34:38.693575885 +0000 UTC m=+170.152035172" Apr 23 13:34:38.716826 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:38.716709 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-mfnwd" podStartSLOduration=2.047803246 podStartE2EDuration="2.716693771s" podCreationTimestamp="2026-04-23 13:34:36 +0000 UTC" firstStartedPulling="2026-04-23 13:34:36.461601112 +0000 UTC m=+167.920060364" lastFinishedPulling="2026-04-23 13:34:37.130491633 +0000 UTC m=+168.588950889" observedRunningTime="2026-04-23 13:34:38.71652981 +0000 UTC m=+170.174989096" watchObservedRunningTime="2026-04-23 13:34:38.716693771 +0000 UTC m=+170.175153069" Apr 23 13:34:38.805955 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:38.805888 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-f9vzx"] Apr 23 13:34:38.810286 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:38.810265 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-6bcc868b7-f9vzx" Apr 23 13:34:38.812563 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:38.812544 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 23 13:34:38.812790 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:38.812774 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 23 13:34:38.812874 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:38.812775 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-j7szc\"" Apr 23 13:34:38.817708 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:38.817686 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-f9vzx"] Apr 23 13:34:38.946830 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:38.946795 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fljjg\" (UniqueName: \"kubernetes.io/projected/b53af7fd-f48d-4cee-aa0a-867edf1e7051-kube-api-access-fljjg\") pod \"downloads-6bcc868b7-f9vzx\" (UID: \"b53af7fd-f48d-4cee-aa0a-867edf1e7051\") " pod="openshift-console/downloads-6bcc868b7-f9vzx" Apr 23 13:34:39.047860 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:39.047817 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fljjg\" (UniqueName: \"kubernetes.io/projected/b53af7fd-f48d-4cee-aa0a-867edf1e7051-kube-api-access-fljjg\") pod \"downloads-6bcc868b7-f9vzx\" (UID: \"b53af7fd-f48d-4cee-aa0a-867edf1e7051\") " pod="openshift-console/downloads-6bcc868b7-f9vzx" Apr 23 13:34:39.059399 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:39.059333 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fljjg\" (UniqueName: 
\"kubernetes.io/projected/b53af7fd-f48d-4cee-aa0a-867edf1e7051-kube-api-access-fljjg\") pod \"downloads-6bcc868b7-f9vzx\" (UID: \"b53af7fd-f48d-4cee-aa0a-867edf1e7051\") " pod="openshift-console/downloads-6bcc868b7-f9vzx" Apr 23 13:34:39.120338 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:39.120301 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-f9vzx" Apr 23 13:34:39.160088 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:39.159653 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-294bt" Apr 23 13:34:39.162140 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:39.162114 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-vtx8h\"" Apr 23 13:34:39.171073 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:39.171031 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-294bt" Apr 23 13:34:39.262067 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:39.262021 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-f9vzx"] Apr 23 13:34:39.267861 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:34:39.267812 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb53af7fd_f48d_4cee_aa0a_867edf1e7051.slice/crio-0ee60d665cdfcf274d48cce7e2fbd6112d586f7554e6e4c7af136f6a3bd2d458 WatchSource:0}: Error finding container 0ee60d665cdfcf274d48cce7e2fbd6112d586f7554e6e4c7af136f6a3bd2d458: Status 404 returned error can't find the container with id 0ee60d665cdfcf274d48cce7e2fbd6112d586f7554e6e4c7af136f6a3bd2d458 Apr 23 13:34:39.319199 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:39.319110 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-294bt"] Apr 23 13:34:39.321911 ip-10-0-134-22 kubenswrapper[2576]: 
W0423 13:34:39.321878 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod85ae1732_8725_4694_959f_e9e424548aca.slice/crio-eb3baaaf117544f5e2468778e466b1f4feacd4a59e742e97bd00ad76c5871f41 WatchSource:0}: Error finding container eb3baaaf117544f5e2468778e466b1f4feacd4a59e742e97bd00ad76c5871f41: Status 404 returned error can't find the container with id eb3baaaf117544f5e2468778e466b1f4feacd4a59e742e97bd00ad76c5871f41 Apr 23 13:34:39.668795 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:39.668754 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-294bt" event={"ID":"85ae1732-8725-4694-959f-e9e424548aca","Type":"ContainerStarted","Data":"eb3baaaf117544f5e2468778e466b1f4feacd4a59e742e97bd00ad76c5871f41"} Apr 23 13:34:39.669912 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:39.669877 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-f9vzx" event={"ID":"b53af7fd-f48d-4cee-aa0a-867edf1e7051","Type":"ContainerStarted","Data":"0ee60d665cdfcf274d48cce7e2fbd6112d586f7554e6e4c7af136f6a3bd2d458"} Apr 23 13:34:40.155923 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:40.155887 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-564gb" Apr 23 13:34:40.156717 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:40.156446 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-x77wr" Apr 23 13:34:40.159235 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:40.159214 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-8bjdp\"" Apr 23 13:34:40.167714 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:40.167471 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-x77wr" Apr 23 13:34:40.330448 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:40.330372 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-x77wr"] Apr 23 13:34:40.510768 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:40.510679 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-cf57fc8d5-52tld"] Apr 23 13:34:40.513950 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:40.513920 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-cf57fc8d5-52tld" Apr 23 13:34:40.516589 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:40.516567 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\"" Apr 23 13:34:40.517655 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:40.517321 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 23 13:34:40.517655 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:40.517350 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-gj84x\"" Apr 23 13:34:40.517655 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:40.517377 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\"" Apr 23 13:34:40.517655 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:40.517395 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-7cu13t67p99ic\"" Apr 23 13:34:40.517655 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:40.517479 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\"" Apr 23 13:34:40.527679 
ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:40.526967 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-cf57fc8d5-52tld"] Apr 23 13:34:40.558688 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:40.558652 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/6dbda42f-9298-42c4-8400-79899f5a6c90-audit-log\") pod \"metrics-server-cf57fc8d5-52tld\" (UID: \"6dbda42f-9298-42c4-8400-79899f5a6c90\") " pod="openshift-monitoring/metrics-server-cf57fc8d5-52tld" Apr 23 13:34:40.558859 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:40.558714 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/6dbda42f-9298-42c4-8400-79899f5a6c90-secret-metrics-server-client-certs\") pod \"metrics-server-cf57fc8d5-52tld\" (UID: \"6dbda42f-9298-42c4-8400-79899f5a6c90\") " pod="openshift-monitoring/metrics-server-cf57fc8d5-52tld" Apr 23 13:34:40.558859 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:40.558818 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6dbda42f-9298-42c4-8400-79899f5a6c90-client-ca-bundle\") pod \"metrics-server-cf57fc8d5-52tld\" (UID: \"6dbda42f-9298-42c4-8400-79899f5a6c90\") " pod="openshift-monitoring/metrics-server-cf57fc8d5-52tld" Apr 23 13:34:40.558859 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:40.558852 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/6dbda42f-9298-42c4-8400-79899f5a6c90-secret-metrics-server-tls\") pod \"metrics-server-cf57fc8d5-52tld\" (UID: \"6dbda42f-9298-42c4-8400-79899f5a6c90\") " pod="openshift-monitoring/metrics-server-cf57fc8d5-52tld" Apr 23 
13:34:40.559014 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:40.558875 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6dbda42f-9298-42c4-8400-79899f5a6c90-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-cf57fc8d5-52tld\" (UID: \"6dbda42f-9298-42c4-8400-79899f5a6c90\") " pod="openshift-monitoring/metrics-server-cf57fc8d5-52tld" Apr 23 13:34:40.559014 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:40.558899 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gmlsh\" (UniqueName: \"kubernetes.io/projected/6dbda42f-9298-42c4-8400-79899f5a6c90-kube-api-access-gmlsh\") pod \"metrics-server-cf57fc8d5-52tld\" (UID: \"6dbda42f-9298-42c4-8400-79899f5a6c90\") " pod="openshift-monitoring/metrics-server-cf57fc8d5-52tld" Apr 23 13:34:40.559014 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:40.558933 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/6dbda42f-9298-42c4-8400-79899f5a6c90-metrics-server-audit-profiles\") pod \"metrics-server-cf57fc8d5-52tld\" (UID: \"6dbda42f-9298-42c4-8400-79899f5a6c90\") " pod="openshift-monitoring/metrics-server-cf57fc8d5-52tld" Apr 23 13:34:40.659991 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:40.659948 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/6dbda42f-9298-42c4-8400-79899f5a6c90-secret-metrics-server-client-certs\") pod \"metrics-server-cf57fc8d5-52tld\" (UID: \"6dbda42f-9298-42c4-8400-79899f5a6c90\") " pod="openshift-monitoring/metrics-server-cf57fc8d5-52tld" Apr 23 13:34:40.660151 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:40.660064 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6dbda42f-9298-42c4-8400-79899f5a6c90-client-ca-bundle\") pod \"metrics-server-cf57fc8d5-52tld\" (UID: \"6dbda42f-9298-42c4-8400-79899f5a6c90\") " pod="openshift-monitoring/metrics-server-cf57fc8d5-52tld" Apr 23 13:34:40.660151 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:40.660107 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/6dbda42f-9298-42c4-8400-79899f5a6c90-secret-metrics-server-tls\") pod \"metrics-server-cf57fc8d5-52tld\" (UID: \"6dbda42f-9298-42c4-8400-79899f5a6c90\") " pod="openshift-monitoring/metrics-server-cf57fc8d5-52tld" Apr 23 13:34:40.660151 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:40.660140 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6dbda42f-9298-42c4-8400-79899f5a6c90-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-cf57fc8d5-52tld\" (UID: \"6dbda42f-9298-42c4-8400-79899f5a6c90\") " pod="openshift-monitoring/metrics-server-cf57fc8d5-52tld" Apr 23 13:34:40.660311 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:40.660165 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gmlsh\" (UniqueName: \"kubernetes.io/projected/6dbda42f-9298-42c4-8400-79899f5a6c90-kube-api-access-gmlsh\") pod \"metrics-server-cf57fc8d5-52tld\" (UID: \"6dbda42f-9298-42c4-8400-79899f5a6c90\") " pod="openshift-monitoring/metrics-server-cf57fc8d5-52tld" Apr 23 13:34:40.660311 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:40.660207 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/6dbda42f-9298-42c4-8400-79899f5a6c90-metrics-server-audit-profiles\") pod \"metrics-server-cf57fc8d5-52tld\" 
(UID: \"6dbda42f-9298-42c4-8400-79899f5a6c90\") " pod="openshift-monitoring/metrics-server-cf57fc8d5-52tld" Apr 23 13:34:40.660311 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:40.660253 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/6dbda42f-9298-42c4-8400-79899f5a6c90-audit-log\") pod \"metrics-server-cf57fc8d5-52tld\" (UID: \"6dbda42f-9298-42c4-8400-79899f5a6c90\") " pod="openshift-monitoring/metrics-server-cf57fc8d5-52tld" Apr 23 13:34:40.661010 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:40.660979 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6dbda42f-9298-42c4-8400-79899f5a6c90-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-cf57fc8d5-52tld\" (UID: \"6dbda42f-9298-42c4-8400-79899f5a6c90\") " pod="openshift-monitoring/metrics-server-cf57fc8d5-52tld" Apr 23 13:34:40.661250 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:40.661226 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/6dbda42f-9298-42c4-8400-79899f5a6c90-audit-log\") pod \"metrics-server-cf57fc8d5-52tld\" (UID: \"6dbda42f-9298-42c4-8400-79899f5a6c90\") " pod="openshift-monitoring/metrics-server-cf57fc8d5-52tld" Apr 23 13:34:40.661487 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:40.661436 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/6dbda42f-9298-42c4-8400-79899f5a6c90-metrics-server-audit-profiles\") pod \"metrics-server-cf57fc8d5-52tld\" (UID: \"6dbda42f-9298-42c4-8400-79899f5a6c90\") " pod="openshift-monitoring/metrics-server-cf57fc8d5-52tld" Apr 23 13:34:40.663094 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:40.663052 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/6dbda42f-9298-42c4-8400-79899f5a6c90-client-ca-bundle\") pod \"metrics-server-cf57fc8d5-52tld\" (UID: \"6dbda42f-9298-42c4-8400-79899f5a6c90\") " pod="openshift-monitoring/metrics-server-cf57fc8d5-52tld" Apr 23 13:34:40.664645 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:40.664598 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/6dbda42f-9298-42c4-8400-79899f5a6c90-secret-metrics-server-client-certs\") pod \"metrics-server-cf57fc8d5-52tld\" (UID: \"6dbda42f-9298-42c4-8400-79899f5a6c90\") " pod="openshift-monitoring/metrics-server-cf57fc8d5-52tld" Apr 23 13:34:40.665238 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:40.665195 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/6dbda42f-9298-42c4-8400-79899f5a6c90-secret-metrics-server-tls\") pod \"metrics-server-cf57fc8d5-52tld\" (UID: \"6dbda42f-9298-42c4-8400-79899f5a6c90\") " pod="openshift-monitoring/metrics-server-cf57fc8d5-52tld" Apr 23 13:34:40.668166 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:40.668141 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gmlsh\" (UniqueName: \"kubernetes.io/projected/6dbda42f-9298-42c4-8400-79899f5a6c90-kube-api-access-gmlsh\") pod \"metrics-server-cf57fc8d5-52tld\" (UID: \"6dbda42f-9298-42c4-8400-79899f5a6c90\") " pod="openshift-monitoring/metrics-server-cf57fc8d5-52tld" Apr 23 13:34:40.831767 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:40.831675 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-cf57fc8d5-52tld" Apr 23 13:34:41.416050 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:34:41.416015 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a73ca9_a37a_469b_8043_50d6b6f5ae10.slice/crio-f7d93c6a346342171473f9bf21300c0db30ac95195cce0e4dd5c5b327c79153a WatchSource:0}: Error finding container f7d93c6a346342171473f9bf21300c0db30ac95195cce0e4dd5c5b327c79153a: Status 404 returned error can't find the container with id f7d93c6a346342171473f9bf21300c0db30ac95195cce0e4dd5c5b327c79153a Apr 23 13:34:41.626911 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:41.626867 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-cf57fc8d5-52tld"] Apr 23 13:34:41.630204 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:34:41.630154 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6dbda42f_9298_42c4_8400_79899f5a6c90.slice/crio-99ede8807048b46890ae6bd1a0f13c9cf1327df41e05b484e2b670ae2bcc1510 WatchSource:0}: Error finding container 99ede8807048b46890ae6bd1a0f13c9cf1327df41e05b484e2b670ae2bcc1510: Status 404 returned error can't find the container with id 99ede8807048b46890ae6bd1a0f13c9cf1327df41e05b484e2b670ae2bcc1510 Apr 23 13:34:41.682658 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:41.682598 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-x77wr" event={"ID":"37a73ca9-a37a-469b-8043-50d6b6f5ae10","Type":"ContainerStarted","Data":"f7d93c6a346342171473f9bf21300c0db30ac95195cce0e4dd5c5b327c79153a"} Apr 23 13:34:41.684812 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:41.684782 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-77c79b9cd5-w4mpm" 
event={"ID":"0c9fffe4-836e-4eb1-ab91-fe058d9c2765","Type":"ContainerStarted","Data":"0de1f2c30d7d50fb6593fa677de0d06e5b3491875742a529c4137222437a07e4"} Apr 23 13:34:41.684917 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:41.684822 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-77c79b9cd5-w4mpm" event={"ID":"0c9fffe4-836e-4eb1-ab91-fe058d9c2765","Type":"ContainerStarted","Data":"4a06a61a65cf6232bb95b71d664f82ad7afffa0727a2d70350a970e83ba9d04f"} Apr 23 13:34:41.689444 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:41.689381 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-294bt" event={"ID":"85ae1732-8725-4694-959f-e9e424548aca","Type":"ContainerStarted","Data":"8a846ab9bf5ad7d474126dad0eddc343868efb58b540774c7a65ce2f7be68b94"} Apr 23 13:34:41.691458 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:41.691387 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-cf57fc8d5-52tld" event={"ID":"6dbda42f-9298-42c4-8400-79899f5a6c90","Type":"ContainerStarted","Data":"99ede8807048b46890ae6bd1a0f13c9cf1327df41e05b484e2b670ae2bcc1510"} Apr 23 13:34:42.698100 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:42.698060 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-77c79b9cd5-w4mpm" event={"ID":"0c9fffe4-836e-4eb1-ab91-fe058d9c2765","Type":"ContainerStarted","Data":"67037665e52c4daefc3cc5a7a20305bf59395124f13d54062eccfc3c58df5736"} Apr 23 13:34:42.700718 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:42.700682 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-294bt" event={"ID":"85ae1732-8725-4694-959f-e9e424548aca","Type":"ContainerStarted","Data":"3a4844692e3159224e9fbe315ef42154cfe4998816c39f220eeb61ff6d99973f"} Apr 23 13:34:42.701037 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:42.700882 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not 
ready" pod="openshift-dns/dns-default-294bt" Apr 23 13:34:42.722677 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:42.718472 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-294bt" podStartSLOduration=139.562506664 podStartE2EDuration="2m21.718452146s" podCreationTimestamp="2026-04-23 13:32:21 +0000 UTC" firstStartedPulling="2026-04-23 13:34:39.324012208 +0000 UTC m=+170.782471460" lastFinishedPulling="2026-04-23 13:34:41.479957676 +0000 UTC m=+172.938416942" observedRunningTime="2026-04-23 13:34:42.718229093 +0000 UTC m=+174.176688367" watchObservedRunningTime="2026-04-23 13:34:42.718452146 +0000 UTC m=+174.176911418" Apr 23 13:34:43.707104 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:43.707064 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-77c79b9cd5-w4mpm" event={"ID":"0c9fffe4-836e-4eb1-ab91-fe058d9c2765","Type":"ContainerStarted","Data":"b143121d79ecd76a24e0b4d024a7b45df1aec690efa9cbe4422f6d1df9101092"} Apr 23 13:34:44.713617 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:44.713580 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-77c79b9cd5-w4mpm" event={"ID":"0c9fffe4-836e-4eb1-ab91-fe058d9c2765","Type":"ContainerStarted","Data":"2d7edd8dd295bd2eb4eca71b6105c7ac618d3e27c55267d0a3b2d859cfeca2b3"} Apr 23 13:34:44.713617 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:44.713623 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-77c79b9cd5-w4mpm" event={"ID":"0c9fffe4-836e-4eb1-ab91-fe058d9c2765","Type":"ContainerStarted","Data":"012eaa36b75cd401b035661d9eaf3be52e45f4a60aff3b415250a22789f7cbfb"} Apr 23 13:34:44.714156 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:44.713779 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-77c79b9cd5-w4mpm" Apr 23 13:34:44.715301 ip-10-0-134-22 
kubenswrapper[2576]: I0423 13:34:44.715260 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-cf57fc8d5-52tld" event={"ID":"6dbda42f-9298-42c4-8400-79899f5a6c90","Type":"ContainerStarted","Data":"e9b98b896e807ff9a8793164aeb5303dbe3496e3161ce7053487f519611cb1cb"} Apr 23 13:34:44.716991 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:44.716952 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-x77wr" event={"ID":"37a73ca9-a37a-469b-8043-50d6b6f5ae10","Type":"ContainerStarted","Data":"b96e73e9c3fd887fce6c6af5cbf04956677f09eeb6678c89b44c773e79a0685a"} Apr 23 13:34:44.736231 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:44.736173 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-77c79b9cd5-w4mpm" podStartSLOduration=2.755173406 podStartE2EDuration="6.736154757s" podCreationTimestamp="2026-04-23 13:34:38 +0000 UTC" firstStartedPulling="2026-04-23 13:34:38.640567771 +0000 UTC m=+170.099027023" lastFinishedPulling="2026-04-23 13:34:42.621549107 +0000 UTC m=+174.080008374" observedRunningTime="2026-04-23 13:34:44.734193541 +0000 UTC m=+176.192652818" watchObservedRunningTime="2026-04-23 13:34:44.736154757 +0000 UTC m=+176.194614035" Apr 23 13:34:44.749513 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:44.749460 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-x77wr" podStartSLOduration=141.449750299 podStartE2EDuration="2m23.749448649s" podCreationTimestamp="2026-04-23 13:32:21 +0000 UTC" firstStartedPulling="2026-04-23 13:34:41.466539806 +0000 UTC m=+172.924999073" lastFinishedPulling="2026-04-23 13:34:43.766238168 +0000 UTC m=+175.224697423" observedRunningTime="2026-04-23 13:34:44.747657396 +0000 UTC m=+176.206116672" watchObservedRunningTime="2026-04-23 13:34:44.749448649 +0000 UTC m=+176.207907922" Apr 23 13:34:44.764287 ip-10-0-134-22 
kubenswrapper[2576]: I0423 13:34:44.764226 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-cf57fc8d5-52tld" podStartSLOduration=2.6366666050000003 podStartE2EDuration="4.764207895s" podCreationTimestamp="2026-04-23 13:34:40 +0000 UTC" firstStartedPulling="2026-04-23 13:34:41.633586418 +0000 UTC m=+173.092045671" lastFinishedPulling="2026-04-23 13:34:43.761127697 +0000 UTC m=+175.219586961" observedRunningTime="2026-04-23 13:34:44.762861877 +0000 UTC m=+176.221321151" watchObservedRunningTime="2026-04-23 13:34:44.764207895 +0000 UTC m=+176.222667176" Apr 23 13:34:48.127883 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:48.127843 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-6cdf6d9b7-9h84v"] Apr 23 13:34:48.131481 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:48.131455 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6cdf6d9b7-9h84v" Apr 23 13:34:48.134025 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:48.133996 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 23 13:34:48.134805 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:48.134781 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 23 13:34:48.134919 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:48.134854 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 23 13:34:48.134919 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:48.134867 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 23 13:34:48.135053 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:48.134787 2576 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-console\"/\"console-dockercfg-vhfkt\"" Apr 23 13:34:48.135159 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:48.135142 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 23 13:34:48.139808 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:48.139781 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6cdf6d9b7-9h84v"] Apr 23 13:34:48.230037 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:48.229996 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b29902e1-0905-4a3c-99d5-2f5c3a966422-service-ca\") pod \"console-6cdf6d9b7-9h84v\" (UID: \"b29902e1-0905-4a3c-99d5-2f5c3a966422\") " pod="openshift-console/console-6cdf6d9b7-9h84v" Apr 23 13:34:48.230222 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:48.230103 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b29902e1-0905-4a3c-99d5-2f5c3a966422-console-oauth-config\") pod \"console-6cdf6d9b7-9h84v\" (UID: \"b29902e1-0905-4a3c-99d5-2f5c3a966422\") " pod="openshift-console/console-6cdf6d9b7-9h84v" Apr 23 13:34:48.230222 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:48.230137 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b29902e1-0905-4a3c-99d5-2f5c3a966422-console-config\") pod \"console-6cdf6d9b7-9h84v\" (UID: \"b29902e1-0905-4a3c-99d5-2f5c3a966422\") " pod="openshift-console/console-6cdf6d9b7-9h84v" Apr 23 13:34:48.230222 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:48.230165 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/b29902e1-0905-4a3c-99d5-2f5c3a966422-oauth-serving-cert\") pod \"console-6cdf6d9b7-9h84v\" (UID: \"b29902e1-0905-4a3c-99d5-2f5c3a966422\") " pod="openshift-console/console-6cdf6d9b7-9h84v" Apr 23 13:34:48.230388 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:48.230266 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b29902e1-0905-4a3c-99d5-2f5c3a966422-console-serving-cert\") pod \"console-6cdf6d9b7-9h84v\" (UID: \"b29902e1-0905-4a3c-99d5-2f5c3a966422\") " pod="openshift-console/console-6cdf6d9b7-9h84v" Apr 23 13:34:48.230388 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:48.230303 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-npppq\" (UniqueName: \"kubernetes.io/projected/b29902e1-0905-4a3c-99d5-2f5c3a966422-kube-api-access-npppq\") pod \"console-6cdf6d9b7-9h84v\" (UID: \"b29902e1-0905-4a3c-99d5-2f5c3a966422\") " pod="openshift-console/console-6cdf6d9b7-9h84v" Apr 23 13:34:48.331566 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:48.331502 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b29902e1-0905-4a3c-99d5-2f5c3a966422-console-oauth-config\") pod \"console-6cdf6d9b7-9h84v\" (UID: \"b29902e1-0905-4a3c-99d5-2f5c3a966422\") " pod="openshift-console/console-6cdf6d9b7-9h84v" Apr 23 13:34:48.331566 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:48.331558 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b29902e1-0905-4a3c-99d5-2f5c3a966422-console-config\") pod \"console-6cdf6d9b7-9h84v\" (UID: \"b29902e1-0905-4a3c-99d5-2f5c3a966422\") " pod="openshift-console/console-6cdf6d9b7-9h84v" Apr 23 13:34:48.331805 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:48.331586 2576 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b29902e1-0905-4a3c-99d5-2f5c3a966422-oauth-serving-cert\") pod \"console-6cdf6d9b7-9h84v\" (UID: \"b29902e1-0905-4a3c-99d5-2f5c3a966422\") " pod="openshift-console/console-6cdf6d9b7-9h84v" Apr 23 13:34:48.331805 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:48.331650 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b29902e1-0905-4a3c-99d5-2f5c3a966422-console-serving-cert\") pod \"console-6cdf6d9b7-9h84v\" (UID: \"b29902e1-0905-4a3c-99d5-2f5c3a966422\") " pod="openshift-console/console-6cdf6d9b7-9h84v" Apr 23 13:34:48.331805 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:48.331678 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-npppq\" (UniqueName: \"kubernetes.io/projected/b29902e1-0905-4a3c-99d5-2f5c3a966422-kube-api-access-npppq\") pod \"console-6cdf6d9b7-9h84v\" (UID: \"b29902e1-0905-4a3c-99d5-2f5c3a966422\") " pod="openshift-console/console-6cdf6d9b7-9h84v" Apr 23 13:34:48.331805 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:48.331743 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b29902e1-0905-4a3c-99d5-2f5c3a966422-service-ca\") pod \"console-6cdf6d9b7-9h84v\" (UID: \"b29902e1-0905-4a3c-99d5-2f5c3a966422\") " pod="openshift-console/console-6cdf6d9b7-9h84v" Apr 23 13:34:48.332490 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:48.332456 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b29902e1-0905-4a3c-99d5-2f5c3a966422-oauth-serving-cert\") pod \"console-6cdf6d9b7-9h84v\" (UID: \"b29902e1-0905-4a3c-99d5-2f5c3a966422\") " pod="openshift-console/console-6cdf6d9b7-9h84v" Apr 23 13:34:48.332605 ip-10-0-134-22 
kubenswrapper[2576]: I0423 13:34:48.332456 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b29902e1-0905-4a3c-99d5-2f5c3a966422-console-config\") pod \"console-6cdf6d9b7-9h84v\" (UID: \"b29902e1-0905-4a3c-99d5-2f5c3a966422\") " pod="openshift-console/console-6cdf6d9b7-9h84v" Apr 23 13:34:48.332605 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:48.332513 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b29902e1-0905-4a3c-99d5-2f5c3a966422-service-ca\") pod \"console-6cdf6d9b7-9h84v\" (UID: \"b29902e1-0905-4a3c-99d5-2f5c3a966422\") " pod="openshift-console/console-6cdf6d9b7-9h84v" Apr 23 13:34:48.334261 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:48.334232 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b29902e1-0905-4a3c-99d5-2f5c3a966422-console-oauth-config\") pod \"console-6cdf6d9b7-9h84v\" (UID: \"b29902e1-0905-4a3c-99d5-2f5c3a966422\") " pod="openshift-console/console-6cdf6d9b7-9h84v" Apr 23 13:34:48.334703 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:48.334685 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b29902e1-0905-4a3c-99d5-2f5c3a966422-console-serving-cert\") pod \"console-6cdf6d9b7-9h84v\" (UID: \"b29902e1-0905-4a3c-99d5-2f5c3a966422\") " pod="openshift-console/console-6cdf6d9b7-9h84v" Apr 23 13:34:48.347738 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:48.347711 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-npppq\" (UniqueName: \"kubernetes.io/projected/b29902e1-0905-4a3c-99d5-2f5c3a966422-kube-api-access-npppq\") pod \"console-6cdf6d9b7-9h84v\" (UID: \"b29902e1-0905-4a3c-99d5-2f5c3a966422\") " pod="openshift-console/console-6cdf6d9b7-9h84v" Apr 23 13:34:48.442144 
ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:48.442107 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6cdf6d9b7-9h84v" Apr 23 13:34:48.574377 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:48.574350 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6cdf6d9b7-9h84v"] Apr 23 13:34:48.576805 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:34:48.576776 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb29902e1_0905_4a3c_99d5_2f5c3a966422.slice/crio-f06f83549d3f57f952f2fa7482c87269407197a2a469fcedf330072d8d71acb5 WatchSource:0}: Error finding container f06f83549d3f57f952f2fa7482c87269407197a2a469fcedf330072d8d71acb5: Status 404 returned error can't find the container with id f06f83549d3f57f952f2fa7482c87269407197a2a469fcedf330072d8d71acb5 Apr 23 13:34:48.730237 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:48.730152 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6cdf6d9b7-9h84v" event={"ID":"b29902e1-0905-4a3c-99d5-2f5c3a966422","Type":"ContainerStarted","Data":"f06f83549d3f57f952f2fa7482c87269407197a2a469fcedf330072d8d71acb5"} Apr 23 13:34:49.826969 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:49.826930 2576 patch_prober.go:28] interesting pod/image-registry-5449456df7-4lgzh container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 23 13:34:49.827434 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:49.826988 2576 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-5449456df7-4lgzh" podUID="65f33a6a-b4d7-451f-a523-ddee419b31c8" containerName="registry" probeResult="failure" output="HTTP probe 
failed with statuscode: 503" Apr 23 13:34:50.726999 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:50.726973 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-77c79b9cd5-w4mpm" Apr 23 13:34:52.639592 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:52.639552 2576 patch_prober.go:28] interesting pod/image-registry-5449456df7-4lgzh container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 23 13:34:52.640024 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:52.639634 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-5449456df7-4lgzh" podUID="65f33a6a-b4d7-451f-a523-ddee419b31c8" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 13:34:52.711312 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:52.711278 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-294bt" Apr 23 13:34:55.752478 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:55.752441 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-f9vzx" event={"ID":"b53af7fd-f48d-4cee-aa0a-867edf1e7051","Type":"ContainerStarted","Data":"721c656a2ad0d3af35370400c017bce433f510ecc83837fb9c6314db6c07e089"} Apr 23 13:34:55.752875 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:55.752708 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-f9vzx" Apr 23 13:34:55.753787 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:55.753765 2576 patch_prober.go:28] interesting pod/downloads-6bcc868b7-f9vzx container/download-server namespace/openshift-console: Readiness probe status=failure output="Get 
\"http://10.132.0.16:8080/\": dial tcp 10.132.0.16:8080: connect: connection refused" start-of-body= Apr 23 13:34:55.753873 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:55.753822 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-6bcc868b7-f9vzx" podUID="b53af7fd-f48d-4cee-aa0a-867edf1e7051" containerName="download-server" probeResult="failure" output="Get \"http://10.132.0.16:8080/\": dial tcp 10.132.0.16:8080: connect: connection refused" Apr 23 13:34:55.769978 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:55.769928 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-f9vzx" podStartSLOduration=1.377864661 podStartE2EDuration="17.769912356s" podCreationTimestamp="2026-04-23 13:34:38 +0000 UTC" firstStartedPulling="2026-04-23 13:34:39.270751913 +0000 UTC m=+170.729211178" lastFinishedPulling="2026-04-23 13:34:55.662799621 +0000 UTC m=+187.121258873" observedRunningTime="2026-04-23 13:34:55.767747214 +0000 UTC m=+187.226206485" watchObservedRunningTime="2026-04-23 13:34:55.769912356 +0000 UTC m=+187.228371630" Apr 23 13:34:56.769238 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:56.769202 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-f9vzx" Apr 23 13:34:59.069251 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:59.069172 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-5449456df7-4lgzh"] Apr 23 13:34:59.073660 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:59.073629 2576 patch_prober.go:28] interesting pod/image-registry-5449456df7-4lgzh container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 23 13:34:59.073803 
ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:59.073678 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-5449456df7-4lgzh" podUID="65f33a6a-b4d7-451f-a523-ddee419b31c8" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 13:34:59.767711 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:59.767664 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6cdf6d9b7-9h84v" event={"ID":"b29902e1-0905-4a3c-99d5-2f5c3a966422","Type":"ContainerStarted","Data":"2c89c984b00ff316270306f1ba1bd5636620d35e467aa86ae61a0383954a300e"} Apr 23 13:34:59.785844 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:34:59.785794 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6cdf6d9b7-9h84v" podStartSLOduration=1.588410453 podStartE2EDuration="11.785779343s" podCreationTimestamp="2026-04-23 13:34:48 +0000 UTC" firstStartedPulling="2026-04-23 13:34:48.579228769 +0000 UTC m=+180.037688035" lastFinishedPulling="2026-04-23 13:34:58.776597672 +0000 UTC m=+190.235056925" observedRunningTime="2026-04-23 13:34:59.784192685 +0000 UTC m=+191.242651969" watchObservedRunningTime="2026-04-23 13:34:59.785779343 +0000 UTC m=+191.244238617" Apr 23 13:35:00.832109 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:35:00.832075 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-cf57fc8d5-52tld" Apr 23 13:35:00.832493 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:35:00.832145 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-cf57fc8d5-52tld" Apr 23 13:35:02.778646 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:35:02.778610 2576 generic.go:358] "Generic (PLEG): container finished" podID="16f1aa89-3c8a-4d86-9b12-5db500c2ef23" containerID="42c27000c38057ef5a3ef2ec8536cf352e11558bdc39fb0d3eb1712990df5182" 
exitCode=255 Apr 23 13:35:02.779181 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:35:02.778688 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-554c84b6f5-sgbxr" event={"ID":"16f1aa89-3c8a-4d86-9b12-5db500c2ef23","Type":"ContainerDied","Data":"42c27000c38057ef5a3ef2ec8536cf352e11558bdc39fb0d3eb1712990df5182"} Apr 23 13:35:02.786305 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:35:02.786280 2576 scope.go:117] "RemoveContainer" containerID="42c27000c38057ef5a3ef2ec8536cf352e11558bdc39fb0d3eb1712990df5182" Apr 23 13:35:03.783844 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:35:03.783801 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-554c84b6f5-sgbxr" event={"ID":"16f1aa89-3c8a-4d86-9b12-5db500c2ef23","Type":"ContainerStarted","Data":"404f4c6c951e5a83d125c04b16dab6f2b3ae52ce037e355eec9aa504b970b781"} Apr 23 13:35:07.035318 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:35:07.035284 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6cdf6d9b7-9h84v"] Apr 23 13:35:08.442817 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:35:08.442781 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-6cdf6d9b7-9h84v" Apr 23 13:35:09.073527 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:35:09.073500 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-5449456df7-4lgzh" Apr 23 13:35:20.837081 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:35:20.837049 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-cf57fc8d5-52tld" Apr 23 13:35:20.840964 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:35:20.840936 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-monitoring/metrics-server-cf57fc8d5-52tld" Apr 23 13:35:24.090341 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:35:24.090274 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-5449456df7-4lgzh" podUID="65f33a6a-b4d7-451f-a523-ddee419b31c8" containerName="registry" containerID="cri-o://bf4634eebbd57fc70cc9e1fb8e5e0613bff9d582dbaa5794103c3c4a31881958" gracePeriod=30 Apr 23 13:35:24.324143 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:35:24.324121 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5449456df7-4lgzh" Apr 23 13:35:24.438232 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:35:24.438204 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/65f33a6a-b4d7-451f-a523-ddee419b31c8-trusted-ca\") pod \"65f33a6a-b4d7-451f-a523-ddee419b31c8\" (UID: \"65f33a6a-b4d7-451f-a523-ddee419b31c8\") " Apr 23 13:35:24.438406 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:35:24.438272 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/65f33a6a-b4d7-451f-a523-ddee419b31c8-installation-pull-secrets\") pod \"65f33a6a-b4d7-451f-a523-ddee419b31c8\" (UID: \"65f33a6a-b4d7-451f-a523-ddee419b31c8\") " Apr 23 13:35:24.438406 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:35:24.438290 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/65f33a6a-b4d7-451f-a523-ddee419b31c8-registry-certificates\") pod \"65f33a6a-b4d7-451f-a523-ddee419b31c8\" (UID: \"65f33a6a-b4d7-451f-a523-ddee419b31c8\") " Apr 23 13:35:24.438406 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:35:24.438306 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume 
\"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/65f33a6a-b4d7-451f-a523-ddee419b31c8-image-registry-private-configuration\") pod \"65f33a6a-b4d7-451f-a523-ddee419b31c8\" (UID: \"65f33a6a-b4d7-451f-a523-ddee419b31c8\") " Apr 23 13:35:24.438406 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:35:24.438332 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/65f33a6a-b4d7-451f-a523-ddee419b31c8-bound-sa-token\") pod \"65f33a6a-b4d7-451f-a523-ddee419b31c8\" (UID: \"65f33a6a-b4d7-451f-a523-ddee419b31c8\") " Apr 23 13:35:24.438406 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:35:24.438351 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/65f33a6a-b4d7-451f-a523-ddee419b31c8-registry-tls\") pod \"65f33a6a-b4d7-451f-a523-ddee419b31c8\" (UID: \"65f33a6a-b4d7-451f-a523-ddee419b31c8\") " Apr 23 13:35:24.438406 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:35:24.438384 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/65f33a6a-b4d7-451f-a523-ddee419b31c8-ca-trust-extracted\") pod \"65f33a6a-b4d7-451f-a523-ddee419b31c8\" (UID: \"65f33a6a-b4d7-451f-a523-ddee419b31c8\") " Apr 23 13:35:24.438751 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:35:24.438542 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fln4h\" (UniqueName: \"kubernetes.io/projected/65f33a6a-b4d7-451f-a523-ddee419b31c8-kube-api-access-fln4h\") pod \"65f33a6a-b4d7-451f-a523-ddee419b31c8\" (UID: \"65f33a6a-b4d7-451f-a523-ddee419b31c8\") " Apr 23 13:35:24.438751 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:35:24.438652 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/65f33a6a-b4d7-451f-a523-ddee419b31c8-registry-certificates" 
(OuterVolumeSpecName: "registry-certificates") pod "65f33a6a-b4d7-451f-a523-ddee419b31c8" (UID: "65f33a6a-b4d7-451f-a523-ddee419b31c8"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 13:35:24.438751 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:35:24.438711 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/65f33a6a-b4d7-451f-a523-ddee419b31c8-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "65f33a6a-b4d7-451f-a523-ddee419b31c8" (UID: "65f33a6a-b4d7-451f-a523-ddee419b31c8"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 13:35:24.439014 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:35:24.438988 2576 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/65f33a6a-b4d7-451f-a523-ddee419b31c8-registry-certificates\") on node \"ip-10-0-134-22.ec2.internal\" DevicePath \"\"" Apr 23 13:35:24.439138 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:35:24.439018 2576 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/65f33a6a-b4d7-451f-a523-ddee419b31c8-trusted-ca\") on node \"ip-10-0-134-22.ec2.internal\" DevicePath \"\"" Apr 23 13:35:24.441329 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:35:24.441297 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65f33a6a-b4d7-451f-a523-ddee419b31c8-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "65f33a6a-b4d7-451f-a523-ddee419b31c8" (UID: "65f33a6a-b4d7-451f-a523-ddee419b31c8"). InnerVolumeSpecName "image-registry-private-configuration". 
PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 13:35:24.441463 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:35:24.441355 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65f33a6a-b4d7-451f-a523-ddee419b31c8-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "65f33a6a-b4d7-451f-a523-ddee419b31c8" (UID: "65f33a6a-b4d7-451f-a523-ddee419b31c8"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 13:35:24.441463 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:35:24.441372 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65f33a6a-b4d7-451f-a523-ddee419b31c8-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "65f33a6a-b4d7-451f-a523-ddee419b31c8" (UID: "65f33a6a-b4d7-451f-a523-ddee419b31c8"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 13:35:24.441463 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:35:24.441441 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65f33a6a-b4d7-451f-a523-ddee419b31c8-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "65f33a6a-b4d7-451f-a523-ddee419b31c8" (UID: "65f33a6a-b4d7-451f-a523-ddee419b31c8"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 13:35:24.441582 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:35:24.441456 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65f33a6a-b4d7-451f-a523-ddee419b31c8-kube-api-access-fln4h" (OuterVolumeSpecName: "kube-api-access-fln4h") pod "65f33a6a-b4d7-451f-a523-ddee419b31c8" (UID: "65f33a6a-b4d7-451f-a523-ddee419b31c8"). InnerVolumeSpecName "kube-api-access-fln4h". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 13:35:24.447294 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:35:24.447272 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65f33a6a-b4d7-451f-a523-ddee419b31c8-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "65f33a6a-b4d7-451f-a523-ddee419b31c8" (UID: "65f33a6a-b4d7-451f-a523-ddee419b31c8"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 13:35:24.539838 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:35:24.539806 2576 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/65f33a6a-b4d7-451f-a523-ddee419b31c8-bound-sa-token\") on node \"ip-10-0-134-22.ec2.internal\" DevicePath \"\""
Apr 23 13:35:24.539838 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:35:24.539835 2576 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/65f33a6a-b4d7-451f-a523-ddee419b31c8-registry-tls\") on node \"ip-10-0-134-22.ec2.internal\" DevicePath \"\""
Apr 23 13:35:24.539838 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:35:24.539844 2576 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/65f33a6a-b4d7-451f-a523-ddee419b31c8-ca-trust-extracted\") on node \"ip-10-0-134-22.ec2.internal\" DevicePath \"\""
Apr 23 13:35:24.540052 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:35:24.539854 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fln4h\" (UniqueName: \"kubernetes.io/projected/65f33a6a-b4d7-451f-a523-ddee419b31c8-kube-api-access-fln4h\") on node \"ip-10-0-134-22.ec2.internal\" DevicePath \"\""
Apr 23 13:35:24.540052 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:35:24.539866 2576 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/65f33a6a-b4d7-451f-a523-ddee419b31c8-installation-pull-secrets\") on node \"ip-10-0-134-22.ec2.internal\" DevicePath \"\""
Apr 23 13:35:24.540052 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:35:24.539875 2576 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/65f33a6a-b4d7-451f-a523-ddee419b31c8-image-registry-private-configuration\") on node \"ip-10-0-134-22.ec2.internal\" DevicePath \"\""
Apr 23 13:35:24.842898 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:35:24.842812 2576 generic.go:358] "Generic (PLEG): container finished" podID="65f33a6a-b4d7-451f-a523-ddee419b31c8" containerID="bf4634eebbd57fc70cc9e1fb8e5e0613bff9d582dbaa5794103c3c4a31881958" exitCode=0
Apr 23 13:35:24.842898 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:35:24.842875 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5449456df7-4lgzh"
Apr 23 13:35:24.842898 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:35:24.842885 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5449456df7-4lgzh" event={"ID":"65f33a6a-b4d7-451f-a523-ddee419b31c8","Type":"ContainerDied","Data":"bf4634eebbd57fc70cc9e1fb8e5e0613bff9d582dbaa5794103c3c4a31881958"}
Apr 23 13:35:24.843149 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:35:24.842912 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5449456df7-4lgzh" event={"ID":"65f33a6a-b4d7-451f-a523-ddee419b31c8","Type":"ContainerDied","Data":"17ca5c6c30bc8bfc223c93198de1356943fc78b4dbf3067621fa3a1fd8770315"}
Apr 23 13:35:24.843149 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:35:24.842938 2576 scope.go:117] "RemoveContainer" containerID="bf4634eebbd57fc70cc9e1fb8e5e0613bff9d582dbaa5794103c3c4a31881958"
Apr 23 13:35:24.864756 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:35:24.864734 2576 scope.go:117] "RemoveContainer" containerID="bf4634eebbd57fc70cc9e1fb8e5e0613bff9d582dbaa5794103c3c4a31881958"
Apr 23 13:35:24.865113 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:35:24.865093 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf4634eebbd57fc70cc9e1fb8e5e0613bff9d582dbaa5794103c3c4a31881958\": container with ID starting with bf4634eebbd57fc70cc9e1fb8e5e0613bff9d582dbaa5794103c3c4a31881958 not found: ID does not exist" containerID="bf4634eebbd57fc70cc9e1fb8e5e0613bff9d582dbaa5794103c3c4a31881958"
Apr 23 13:35:24.865163 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:35:24.865122 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf4634eebbd57fc70cc9e1fb8e5e0613bff9d582dbaa5794103c3c4a31881958"} err="failed to get container status \"bf4634eebbd57fc70cc9e1fb8e5e0613bff9d582dbaa5794103c3c4a31881958\": rpc error: code = NotFound desc = could not find container \"bf4634eebbd57fc70cc9e1fb8e5e0613bff9d582dbaa5794103c3c4a31881958\": container with ID starting with bf4634eebbd57fc70cc9e1fb8e5e0613bff9d582dbaa5794103c3c4a31881958 not found: ID does not exist"
Apr 23 13:35:24.865377 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:35:24.865359 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-5449456df7-4lgzh"]
Apr 23 13:35:24.869325 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:35:24.869304 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-5449456df7-4lgzh"]
Apr 23 13:35:25.159672 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:35:25.159640 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65f33a6a-b4d7-451f-a523-ddee419b31c8" path="/var/lib/kubelet/pods/65f33a6a-b4d7-451f-a523-ddee419b31c8/volumes"
Apr 23 13:35:32.054043 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:35:32.053977 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-6cdf6d9b7-9h84v" podUID="b29902e1-0905-4a3c-99d5-2f5c3a966422" containerName="console" containerID="cri-o://2c89c984b00ff316270306f1ba1bd5636620d35e467aa86ae61a0383954a300e" gracePeriod=15
Apr 23 13:35:32.325338 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:35:32.325315 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6cdf6d9b7-9h84v_b29902e1-0905-4a3c-99d5-2f5c3a966422/console/0.log"
Apr 23 13:35:32.325483 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:35:32.325381 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6cdf6d9b7-9h84v"
Apr 23 13:35:32.407640 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:35:32.407612 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b29902e1-0905-4a3c-99d5-2f5c3a966422-oauth-serving-cert\") pod \"b29902e1-0905-4a3c-99d5-2f5c3a966422\" (UID: \"b29902e1-0905-4a3c-99d5-2f5c3a966422\") "
Apr 23 13:35:32.407826 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:35:32.407678 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b29902e1-0905-4a3c-99d5-2f5c3a966422-service-ca\") pod \"b29902e1-0905-4a3c-99d5-2f5c3a966422\" (UID: \"b29902e1-0905-4a3c-99d5-2f5c3a966422\") "
Apr 23 13:35:32.407826 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:35:32.407715 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b29902e1-0905-4a3c-99d5-2f5c3a966422-console-serving-cert\") pod \"b29902e1-0905-4a3c-99d5-2f5c3a966422\" (UID: \"b29902e1-0905-4a3c-99d5-2f5c3a966422\") "
Apr 23 13:35:32.407826 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:35:32.407740 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-npppq\" (UniqueName: \"kubernetes.io/projected/b29902e1-0905-4a3c-99d5-2f5c3a966422-kube-api-access-npppq\") pod \"b29902e1-0905-4a3c-99d5-2f5c3a966422\" (UID: \"b29902e1-0905-4a3c-99d5-2f5c3a966422\") "
Apr 23 13:35:32.407826 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:35:32.407768 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b29902e1-0905-4a3c-99d5-2f5c3a966422-console-oauth-config\") pod \"b29902e1-0905-4a3c-99d5-2f5c3a966422\" (UID: \"b29902e1-0905-4a3c-99d5-2f5c3a966422\") "
Apr 23 13:35:32.407826 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:35:32.407792 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b29902e1-0905-4a3c-99d5-2f5c3a966422-console-config\") pod \"b29902e1-0905-4a3c-99d5-2f5c3a966422\" (UID: \"b29902e1-0905-4a3c-99d5-2f5c3a966422\") "
Apr 23 13:35:32.408089 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:35:32.408061 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b29902e1-0905-4a3c-99d5-2f5c3a966422-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "b29902e1-0905-4a3c-99d5-2f5c3a966422" (UID: "b29902e1-0905-4a3c-99d5-2f5c3a966422"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 13:35:32.408137 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:35:32.408080 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b29902e1-0905-4a3c-99d5-2f5c3a966422-service-ca" (OuterVolumeSpecName: "service-ca") pod "b29902e1-0905-4a3c-99d5-2f5c3a966422" (UID: "b29902e1-0905-4a3c-99d5-2f5c3a966422"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 13:35:32.408497 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:35:32.408465 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b29902e1-0905-4a3c-99d5-2f5c3a966422-console-config" (OuterVolumeSpecName: "console-config") pod "b29902e1-0905-4a3c-99d5-2f5c3a966422" (UID: "b29902e1-0905-4a3c-99d5-2f5c3a966422"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 13:35:32.410071 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:35:32.410046 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b29902e1-0905-4a3c-99d5-2f5c3a966422-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "b29902e1-0905-4a3c-99d5-2f5c3a966422" (UID: "b29902e1-0905-4a3c-99d5-2f5c3a966422"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 13:35:32.410170 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:35:32.410094 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b29902e1-0905-4a3c-99d5-2f5c3a966422-kube-api-access-npppq" (OuterVolumeSpecName: "kube-api-access-npppq") pod "b29902e1-0905-4a3c-99d5-2f5c3a966422" (UID: "b29902e1-0905-4a3c-99d5-2f5c3a966422"). InnerVolumeSpecName "kube-api-access-npppq". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 13:35:32.410170 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:35:32.410154 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b29902e1-0905-4a3c-99d5-2f5c3a966422-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "b29902e1-0905-4a3c-99d5-2f5c3a966422" (UID: "b29902e1-0905-4a3c-99d5-2f5c3a966422"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 13:35:32.508758 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:35:32.508724 2576 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b29902e1-0905-4a3c-99d5-2f5c3a966422-service-ca\") on node \"ip-10-0-134-22.ec2.internal\" DevicePath \"\""
Apr 23 13:35:32.508758 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:35:32.508754 2576 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b29902e1-0905-4a3c-99d5-2f5c3a966422-console-serving-cert\") on node \"ip-10-0-134-22.ec2.internal\" DevicePath \"\""
Apr 23 13:35:32.508758 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:35:32.508764 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-npppq\" (UniqueName: \"kubernetes.io/projected/b29902e1-0905-4a3c-99d5-2f5c3a966422-kube-api-access-npppq\") on node \"ip-10-0-134-22.ec2.internal\" DevicePath \"\""
Apr 23 13:35:32.508970 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:35:32.508775 2576 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b29902e1-0905-4a3c-99d5-2f5c3a966422-console-oauth-config\") on node \"ip-10-0-134-22.ec2.internal\" DevicePath \"\""
Apr 23 13:35:32.508970 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:35:32.508784 2576 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b29902e1-0905-4a3c-99d5-2f5c3a966422-console-config\") on node \"ip-10-0-134-22.ec2.internal\" DevicePath \"\""
Apr 23 13:35:32.508970 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:35:32.508794 2576 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b29902e1-0905-4a3c-99d5-2f5c3a966422-oauth-serving-cert\") on node \"ip-10-0-134-22.ec2.internal\" DevicePath \"\""
Apr 23 13:35:32.869785 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:35:32.869758 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6cdf6d9b7-9h84v_b29902e1-0905-4a3c-99d5-2f5c3a966422/console/0.log"
Apr 23 13:35:32.869924 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:35:32.869796 2576 generic.go:358] "Generic (PLEG): container finished" podID="b29902e1-0905-4a3c-99d5-2f5c3a966422" containerID="2c89c984b00ff316270306f1ba1bd5636620d35e467aa86ae61a0383954a300e" exitCode=2
Apr 23 13:35:32.869924 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:35:32.869829 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6cdf6d9b7-9h84v" event={"ID":"b29902e1-0905-4a3c-99d5-2f5c3a966422","Type":"ContainerDied","Data":"2c89c984b00ff316270306f1ba1bd5636620d35e467aa86ae61a0383954a300e"}
Apr 23 13:35:32.869924 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:35:32.869876 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6cdf6d9b7-9h84v" event={"ID":"b29902e1-0905-4a3c-99d5-2f5c3a966422","Type":"ContainerDied","Data":"f06f83549d3f57f952f2fa7482c87269407197a2a469fcedf330072d8d71acb5"}
Apr 23 13:35:32.869924 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:35:32.869879 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6cdf6d9b7-9h84v"
Apr 23 13:35:32.869924 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:35:32.869898 2576 scope.go:117] "RemoveContainer" containerID="2c89c984b00ff316270306f1ba1bd5636620d35e467aa86ae61a0383954a300e"
Apr 23 13:35:32.878595 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:35:32.878576 2576 scope.go:117] "RemoveContainer" containerID="2c89c984b00ff316270306f1ba1bd5636620d35e467aa86ae61a0383954a300e"
Apr 23 13:35:32.878837 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:35:32.878816 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2c89c984b00ff316270306f1ba1bd5636620d35e467aa86ae61a0383954a300e\": container with ID starting with 2c89c984b00ff316270306f1ba1bd5636620d35e467aa86ae61a0383954a300e not found: ID does not exist" containerID="2c89c984b00ff316270306f1ba1bd5636620d35e467aa86ae61a0383954a300e"
Apr 23 13:35:32.878903 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:35:32.878850 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c89c984b00ff316270306f1ba1bd5636620d35e467aa86ae61a0383954a300e"} err="failed to get container status \"2c89c984b00ff316270306f1ba1bd5636620d35e467aa86ae61a0383954a300e\": rpc error: code = NotFound desc = could not find container \"2c89c984b00ff316270306f1ba1bd5636620d35e467aa86ae61a0383954a300e\": container with ID starting with 2c89c984b00ff316270306f1ba1bd5636620d35e467aa86ae61a0383954a300e not found: ID does not exist"
Apr 23 13:35:32.890104 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:35:32.890078 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6cdf6d9b7-9h84v"]
Apr 23 13:35:32.895407 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:35:32.895389 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-6cdf6d9b7-9h84v"]
Apr 23 13:35:33.160344 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:35:33.160314 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b29902e1-0905-4a3c-99d5-2f5c3a966422" path="/var/lib/kubelet/pods/b29902e1-0905-4a3c-99d5-2f5c3a966422/volumes"
Apr 23 13:36:01.041318 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:36:01.041278 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0dcb0e48-f774-49cd-8b04-58a5050a5ff2-metrics-certs\") pod \"network-metrics-daemon-564gb\" (UID: \"0dcb0e48-f774-49cd-8b04-58a5050a5ff2\") " pod="openshift-multus/network-metrics-daemon-564gb"
Apr 23 13:36:01.043649 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:36:01.043624 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0dcb0e48-f774-49cd-8b04-58a5050a5ff2-metrics-certs\") pod \"network-metrics-daemon-564gb\" (UID: \"0dcb0e48-f774-49cd-8b04-58a5050a5ff2\") " pod="openshift-multus/network-metrics-daemon-564gb"
Apr 23 13:36:01.158877 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:36:01.158849 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-rl8cq\""
Apr 23 13:36:01.167529 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:36:01.167510 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-564gb"
Apr 23 13:36:01.283434 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:36:01.283379 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-564gb"]
Apr 23 13:36:01.286265 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:36:01.286239 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0dcb0e48_f774_49cd_8b04_58a5050a5ff2.slice/crio-897670bf7db5791db0dd0a7f503c449e301a12db2f9ed4d0cb784c3ea10b1790 WatchSource:0}: Error finding container 897670bf7db5791db0dd0a7f503c449e301a12db2f9ed4d0cb784c3ea10b1790: Status 404 returned error can't find the container with id 897670bf7db5791db0dd0a7f503c449e301a12db2f9ed4d0cb784c3ea10b1790
Apr 23 13:36:01.947291 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:36:01.947245 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-564gb" event={"ID":"0dcb0e48-f774-49cd-8b04-58a5050a5ff2","Type":"ContainerStarted","Data":"897670bf7db5791db0dd0a7f503c449e301a12db2f9ed4d0cb784c3ea10b1790"}
Apr 23 13:36:02.951071 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:36:02.951037 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-564gb" event={"ID":"0dcb0e48-f774-49cd-8b04-58a5050a5ff2","Type":"ContainerStarted","Data":"9b3539827cd2abfd79ec7756507e686201c5230a621a38424645d32e849cfb3b"}
Apr 23 13:36:02.951071 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:36:02.951072 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-564gb" event={"ID":"0dcb0e48-f774-49cd-8b04-58a5050a5ff2","Type":"ContainerStarted","Data":"5d0c8bf005bb957d9ea5029d8576154c5c59bb6b1a7ff98bd938f808403d21ee"}
Apr 23 13:36:02.966465 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:36:02.966401 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-564gb" podStartSLOduration=252.921231021 podStartE2EDuration="4m13.966385522s" podCreationTimestamp="2026-04-23 13:31:49 +0000 UTC" firstStartedPulling="2026-04-23 13:36:01.288004047 +0000 UTC m=+252.746463300" lastFinishedPulling="2026-04-23 13:36:02.333158547 +0000 UTC m=+253.791617801" observedRunningTime="2026-04-23 13:36:02.965535072 +0000 UTC m=+254.423994371" watchObservedRunningTime="2026-04-23 13:36:02.966385522 +0000 UTC m=+254.424844796"
Apr 23 13:36:49.075154 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:36:49.075122 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-72hmc_edb014da-0558-4a2a-9f98-bea52a2c723e/ovn-acl-logging/0.log"
Apr 23 13:36:49.075154 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:36:49.075128 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-72hmc_edb014da-0558-4a2a-9f98-bea52a2c723e/ovn-acl-logging/0.log"
Apr 23 13:36:49.081549 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:36:49.081528 2576 kubelet.go:1628] "Image garbage collection succeeded"
Apr 23 13:37:16.127715 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:37:16.127680 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-lvj79"]
Apr 23 13:37:16.130077 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:37:16.127992 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b29902e1-0905-4a3c-99d5-2f5c3a966422" containerName="console"
Apr 23 13:37:16.130077 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:37:16.128004 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="b29902e1-0905-4a3c-99d5-2f5c3a966422" containerName="console"
Apr 23 13:37:16.130077 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:37:16.128024 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="65f33a6a-b4d7-451f-a523-ddee419b31c8" containerName="registry"
Apr 23 13:37:16.130077 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:37:16.128030 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="65f33a6a-b4d7-451f-a523-ddee419b31c8" containerName="registry"
Apr 23 13:37:16.130077 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:37:16.128075 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="65f33a6a-b4d7-451f-a523-ddee419b31c8" containerName="registry"
Apr 23 13:37:16.130077 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:37:16.128083 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="b29902e1-0905-4a3c-99d5-2f5c3a966422" containerName="console"
Apr 23 13:37:16.131000 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:37:16.130985 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-lvj79"
Apr 23 13:37:16.133374 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:37:16.133343 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\""
Apr 23 13:37:16.133527 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:37:16.133405 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\""
Apr 23 13:37:16.133527 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:37:16.133448 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\""
Apr 23 13:37:16.133527 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:37:16.133448 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"custom-metrics-autoscaler-operator-dockercfg-t22p4\""
Apr 23 13:37:16.138457 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:37:16.138432 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-lvj79"]
Apr 23 13:37:16.297138 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:37:16.297106 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/db38055f-f57d-4971-bcc9-6888cb9827c4-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-lvj79\" (UID: \"db38055f-f57d-4971-bcc9-6888cb9827c4\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-lvj79"
Apr 23 13:37:16.297308 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:37:16.297151 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2m99c\" (UniqueName: \"kubernetes.io/projected/db38055f-f57d-4971-bcc9-6888cb9827c4-kube-api-access-2m99c\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-lvj79\" (UID: \"db38055f-f57d-4971-bcc9-6888cb9827c4\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-lvj79"
Apr 23 13:37:16.398445 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:37:16.398395 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2m99c\" (UniqueName: \"kubernetes.io/projected/db38055f-f57d-4971-bcc9-6888cb9827c4-kube-api-access-2m99c\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-lvj79\" (UID: \"db38055f-f57d-4971-bcc9-6888cb9827c4\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-lvj79"
Apr 23 13:37:16.398587 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:37:16.398488 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/db38055f-f57d-4971-bcc9-6888cb9827c4-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-lvj79\" (UID: \"db38055f-f57d-4971-bcc9-6888cb9827c4\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-lvj79"
Apr 23 13:37:16.400692 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:37:16.400666 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/db38055f-f57d-4971-bcc9-6888cb9827c4-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-lvj79\" (UID: \"db38055f-f57d-4971-bcc9-6888cb9827c4\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-lvj79"
Apr 23 13:37:16.408301 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:37:16.408275 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2m99c\" (UniqueName: \"kubernetes.io/projected/db38055f-f57d-4971-bcc9-6888cb9827c4-kube-api-access-2m99c\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-lvj79\" (UID: \"db38055f-f57d-4971-bcc9-6888cb9827c4\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-lvj79"
Apr 23 13:37:16.442110 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:37:16.442086 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-lvj79"
Apr 23 13:37:16.565729 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:37:16.565701 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-lvj79"]
Apr 23 13:37:16.568274 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:37:16.568247 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddb38055f_f57d_4971_bcc9_6888cb9827c4.slice/crio-9ac4f6131ca765258af4700e67deb8b8f28968dd18846ee9382bcd8d2882a21c WatchSource:0}: Error finding container 9ac4f6131ca765258af4700e67deb8b8f28968dd18846ee9382bcd8d2882a21c: Status 404 returned error can't find the container with id 9ac4f6131ca765258af4700e67deb8b8f28968dd18846ee9382bcd8d2882a21c
Apr 23 13:37:16.570035 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:37:16.570017 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 23 13:37:17.140922 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:37:17.140865 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-lvj79" event={"ID":"db38055f-f57d-4971-bcc9-6888cb9827c4","Type":"ContainerStarted","Data":"9ac4f6131ca765258af4700e67deb8b8f28968dd18846ee9382bcd8d2882a21c"}
Apr 23 13:37:20.975993 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:37:20.975962 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-lr7nv"]
Apr 23 13:37:20.979238 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:37:20.979221 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-lr7nv"
Apr 23 13:37:20.981702 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:37:20.981671 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\""
Apr 23 13:37:20.981702 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:37:20.981694 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-certs\""
Apr 23 13:37:20.981908 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:37:20.981758 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-n5kkt\""
Apr 23 13:37:20.988194 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:37:20.988173 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-lr7nv"]
Apr 23 13:37:21.138358 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:37:21.138320 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/0ef23222-8d76-40c4-a263-954126c8f342-certificates\") pod \"keda-operator-ffbb595cb-lr7nv\" (UID: \"0ef23222-8d76-40c4-a263-954126c8f342\") " pod="openshift-keda/keda-operator-ffbb595cb-lr7nv"
Apr 23 13:37:21.138526 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:37:21.138370 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/0ef23222-8d76-40c4-a263-954126c8f342-cabundle0\") pod \"keda-operator-ffbb595cb-lr7nv\" (UID: \"0ef23222-8d76-40c4-a263-954126c8f342\") " pod="openshift-keda/keda-operator-ffbb595cb-lr7nv"
Apr 23 13:37:21.138526 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:37:21.138491 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwsdm\" (UniqueName: \"kubernetes.io/projected/0ef23222-8d76-40c4-a263-954126c8f342-kube-api-access-xwsdm\") pod \"keda-operator-ffbb595cb-lr7nv\" (UID: \"0ef23222-8d76-40c4-a263-954126c8f342\") " pod="openshift-keda/keda-operator-ffbb595cb-lr7nv"
Apr 23 13:37:21.160533 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:37:21.160498 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-lvj79" event={"ID":"db38055f-f57d-4971-bcc9-6888cb9827c4","Type":"ContainerStarted","Data":"876604ce29b9a49bfe432ea9700b77d2ad1dd06dbb2e97344245b21a59d4f990"}
Apr 23 13:37:21.160685 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:37:21.160650 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-lvj79"
Apr 23 13:37:21.198070 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:37:21.198020 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-lvj79" podStartSLOduration=1.353211231 podStartE2EDuration="5.198004048s" podCreationTimestamp="2026-04-23 13:37:16 +0000 UTC" firstStartedPulling="2026-04-23 13:37:16.570144086 +0000 UTC m=+328.028603339" lastFinishedPulling="2026-04-23 13:37:20.414936905 +0000 UTC m=+331.873396156" observedRunningTime="2026-04-23 13:37:21.195557418 +0000 UTC m=+332.654016691" watchObservedRunningTime="2026-04-23 13:37:21.198004048 +0000 UTC m=+332.656463322"
Apr 23 13:37:21.239353 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:37:21.239275 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/0ef23222-8d76-40c4-a263-954126c8f342-certificates\") pod \"keda-operator-ffbb595cb-lr7nv\" (UID: \"0ef23222-8d76-40c4-a263-954126c8f342\") " pod="openshift-keda/keda-operator-ffbb595cb-lr7nv"
Apr 23 13:37:21.239353 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:37:21.239320 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/0ef23222-8d76-40c4-a263-954126c8f342-cabundle0\") pod \"keda-operator-ffbb595cb-lr7nv\" (UID: \"0ef23222-8d76-40c4-a263-954126c8f342\") " pod="openshift-keda/keda-operator-ffbb595cb-lr7nv"
Apr 23 13:37:21.239580 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:37:21.239358 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xwsdm\" (UniqueName: \"kubernetes.io/projected/0ef23222-8d76-40c4-a263-954126c8f342-kube-api-access-xwsdm\") pod \"keda-operator-ffbb595cb-lr7nv\" (UID: \"0ef23222-8d76-40c4-a263-954126c8f342\") " pod="openshift-keda/keda-operator-ffbb595cb-lr7nv"
Apr 23 13:37:21.239580 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:37:21.239445 2576 secret.go:281] references non-existent secret key: ca.crt
Apr 23 13:37:21.239580 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:37:21.239471 2576 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 23 13:37:21.239580 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:37:21.239481 2576 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-lr7nv: references non-existent secret key: ca.crt
Apr 23 13:37:21.239580 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:37:21.239539 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0ef23222-8d76-40c4-a263-954126c8f342-certificates podName:0ef23222-8d76-40c4-a263-954126c8f342 nodeName:}" failed. No retries permitted until 2026-04-23 13:37:21.739517767 +0000 UTC m=+333.197977023 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/0ef23222-8d76-40c4-a263-954126c8f342-certificates") pod "keda-operator-ffbb595cb-lr7nv" (UID: "0ef23222-8d76-40c4-a263-954126c8f342") : references non-existent secret key: ca.crt
Apr 23 13:37:21.240092 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:37:21.240069 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/0ef23222-8d76-40c4-a263-954126c8f342-cabundle0\") pod \"keda-operator-ffbb595cb-lr7nv\" (UID: \"0ef23222-8d76-40c4-a263-954126c8f342\") " pod="openshift-keda/keda-operator-ffbb595cb-lr7nv"
Apr 23 13:37:21.252046 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:37:21.252014 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwsdm\" (UniqueName: \"kubernetes.io/projected/0ef23222-8d76-40c4-a263-954126c8f342-kube-api-access-xwsdm\") pod \"keda-operator-ffbb595cb-lr7nv\" (UID: \"0ef23222-8d76-40c4-a263-954126c8f342\") " pod="openshift-keda/keda-operator-ffbb595cb-lr7nv"
Apr 23 13:37:21.744027 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:37:21.743992 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/0ef23222-8d76-40c4-a263-954126c8f342-certificates\") pod \"keda-operator-ffbb595cb-lr7nv\" (UID: \"0ef23222-8d76-40c4-a263-954126c8f342\") " pod="openshift-keda/keda-operator-ffbb595cb-lr7nv"
Apr 23 13:37:21.744214 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:37:21.744112 2576 secret.go:281] references non-existent secret key: ca.crt
Apr 23 13:37:21.744214 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:37:21.744138 2576 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 23 13:37:21.744214 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:37:21.744150 2576 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-lr7nv: references non-existent secret key: ca.crt
Apr 23 13:37:21.744214 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:37:21.744214 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0ef23222-8d76-40c4-a263-954126c8f342-certificates podName:0ef23222-8d76-40c4-a263-954126c8f342 nodeName:}" failed. No retries permitted until 2026-04-23 13:37:22.744195351 +0000 UTC m=+334.202654617 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/0ef23222-8d76-40c4-a263-954126c8f342-certificates") pod "keda-operator-ffbb595cb-lr7nv" (UID: "0ef23222-8d76-40c4-a263-954126c8f342") : references non-existent secret key: ca.crt
Apr 23 13:37:22.753510 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:37:22.753481 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/0ef23222-8d76-40c4-a263-954126c8f342-certificates\") pod \"keda-operator-ffbb595cb-lr7nv\" (UID: \"0ef23222-8d76-40c4-a263-954126c8f342\") " pod="openshift-keda/keda-operator-ffbb595cb-lr7nv"
Apr 23 13:37:22.755822 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:37:22.755793 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/0ef23222-8d76-40c4-a263-954126c8f342-certificates\") pod \"keda-operator-ffbb595cb-lr7nv\" (UID: \"0ef23222-8d76-40c4-a263-954126c8f342\") " pod="openshift-keda/keda-operator-ffbb595cb-lr7nv"
Apr 23 13:37:22.790408 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:37:22.790377 2576
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-lr7nv" Apr 23 13:37:22.913973 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:37:22.913946 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-lr7nv"] Apr 23 13:37:22.916634 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:37:22.916607 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0ef23222_8d76_40c4_a263_954126c8f342.slice/crio-e998211c6034f9755ae330d3168c98d3d7cf442dee4174e570f39e498182bf5c WatchSource:0}: Error finding container e998211c6034f9755ae330d3168c98d3d7cf442dee4174e570f39e498182bf5c: Status 404 returned error can't find the container with id e998211c6034f9755ae330d3168c98d3d7cf442dee4174e570f39e498182bf5c Apr 23 13:37:23.169334 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:37:23.169299 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-lr7nv" event={"ID":"0ef23222-8d76-40c4-a263-954126c8f342","Type":"ContainerStarted","Data":"e998211c6034f9755ae330d3168c98d3d7cf442dee4174e570f39e498182bf5c"} Apr 23 13:37:28.184069 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:37:28.184027 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-lr7nv" event={"ID":"0ef23222-8d76-40c4-a263-954126c8f342","Type":"ContainerStarted","Data":"12078613fd4e975dcee4002ccfe0cc2f3e541c6889adc5aa0aa9dff1e1a0d5af"} Apr 23 13:37:28.184453 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:37:28.184093 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-operator-ffbb595cb-lr7nv" Apr 23 13:37:28.204171 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:37:28.204121 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-operator-ffbb595cb-lr7nv" podStartSLOduration=3.616104903 
podStartE2EDuration="8.204107448s" podCreationTimestamp="2026-04-23 13:37:20 +0000 UTC" firstStartedPulling="2026-04-23 13:37:22.917890888 +0000 UTC m=+334.376350140" lastFinishedPulling="2026-04-23 13:37:27.50589343 +0000 UTC m=+338.964352685" observedRunningTime="2026-04-23 13:37:28.201902779 +0000 UTC m=+339.660362053" watchObservedRunningTime="2026-04-23 13:37:28.204107448 +0000 UTC m=+339.662566722" Apr 23 13:37:42.165277 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:37:42.165202 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-lvj79" Apr 23 13:37:49.188081 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:37:49.188047 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-operator-ffbb595cb-lr7nv" Apr 23 13:38:29.527010 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:38:29.526976 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-bq6h4"] Apr 23 13:38:29.530310 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:38:29.530289 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-bq6h4" Apr 23 13:38:29.532938 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:38:29.532906 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-p9kcb\"" Apr 23 13:38:29.533062 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:38:29.532945 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-webhook-server-cert\"" Apr 23 13:38:29.534485 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:38:29.534255 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 23 13:38:29.534604 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:38:29.534479 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 23 13:38:29.539505 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:38:29.539476 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-bq6h4"] Apr 23 13:38:29.635043 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:38:29.635009 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x59kr\" (UniqueName: \"kubernetes.io/projected/1361ed5d-a9e7-45a5-bac8-e3f8e6d2ad03-kube-api-access-x59kr\") pod \"llmisvc-controller-manager-68cc5db7c4-bq6h4\" (UID: \"1361ed5d-a9e7-45a5-bac8-e3f8e6d2ad03\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-bq6h4" Apr 23 13:38:29.635202 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:38:29.635054 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1361ed5d-a9e7-45a5-bac8-e3f8e6d2ad03-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-bq6h4\" (UID: \"1361ed5d-a9e7-45a5-bac8-e3f8e6d2ad03\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-bq6h4" Apr 23 
13:38:29.736348 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:38:29.736311 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x59kr\" (UniqueName: \"kubernetes.io/projected/1361ed5d-a9e7-45a5-bac8-e3f8e6d2ad03-kube-api-access-x59kr\") pod \"llmisvc-controller-manager-68cc5db7c4-bq6h4\" (UID: \"1361ed5d-a9e7-45a5-bac8-e3f8e6d2ad03\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-bq6h4" Apr 23 13:38:29.736531 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:38:29.736357 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1361ed5d-a9e7-45a5-bac8-e3f8e6d2ad03-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-bq6h4\" (UID: \"1361ed5d-a9e7-45a5-bac8-e3f8e6d2ad03\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-bq6h4" Apr 23 13:38:29.736531 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:38:29.736479 2576 secret.go:189] Couldn't get secret kserve/llmisvc-webhook-server-cert: secret "llmisvc-webhook-server-cert" not found Apr 23 13:38:29.736619 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:38:29.736562 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1361ed5d-a9e7-45a5-bac8-e3f8e6d2ad03-cert podName:1361ed5d-a9e7-45a5-bac8-e3f8e6d2ad03 nodeName:}" failed. No retries permitted until 2026-04-23 13:38:30.236543331 +0000 UTC m=+401.695002583 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1361ed5d-a9e7-45a5-bac8-e3f8e6d2ad03-cert") pod "llmisvc-controller-manager-68cc5db7c4-bq6h4" (UID: "1361ed5d-a9e7-45a5-bac8-e3f8e6d2ad03") : secret "llmisvc-webhook-server-cert" not found Apr 23 13:38:29.748396 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:38:29.748361 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x59kr\" (UniqueName: \"kubernetes.io/projected/1361ed5d-a9e7-45a5-bac8-e3f8e6d2ad03-kube-api-access-x59kr\") pod \"llmisvc-controller-manager-68cc5db7c4-bq6h4\" (UID: \"1361ed5d-a9e7-45a5-bac8-e3f8e6d2ad03\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-bq6h4" Apr 23 13:38:30.240662 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:38:30.240594 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1361ed5d-a9e7-45a5-bac8-e3f8e6d2ad03-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-bq6h4\" (UID: \"1361ed5d-a9e7-45a5-bac8-e3f8e6d2ad03\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-bq6h4" Apr 23 13:38:30.242957 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:38:30.242926 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1361ed5d-a9e7-45a5-bac8-e3f8e6d2ad03-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-bq6h4\" (UID: \"1361ed5d-a9e7-45a5-bac8-e3f8e6d2ad03\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-bq6h4" Apr 23 13:38:30.447103 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:38:30.447066 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-bq6h4" Apr 23 13:38:30.568433 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:38:30.568312 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-bq6h4"] Apr 23 13:38:30.570990 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:38:30.570965 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod1361ed5d_a9e7_45a5_bac8_e3f8e6d2ad03.slice/crio-d80751074e7d9f2717cf5009bb0d82812988efd2a044c95d8af1e844691eef63 WatchSource:0}: Error finding container d80751074e7d9f2717cf5009bb0d82812988efd2a044c95d8af1e844691eef63: Status 404 returned error can't find the container with id d80751074e7d9f2717cf5009bb0d82812988efd2a044c95d8af1e844691eef63 Apr 23 13:38:31.355870 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:38:31.355832 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-bq6h4" event={"ID":"1361ed5d-a9e7-45a5-bac8-e3f8e6d2ad03","Type":"ContainerStarted","Data":"d80751074e7d9f2717cf5009bb0d82812988efd2a044c95d8af1e844691eef63"} Apr 23 13:38:33.362774 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:38:33.362736 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-bq6h4" event={"ID":"1361ed5d-a9e7-45a5-bac8-e3f8e6d2ad03","Type":"ContainerStarted","Data":"a8cb8305100cd846b59081fd1e0657c702bb8875d452c2fa36dddc9012a4d05a"} Apr 23 13:38:33.363145 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:38:33.362879 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-bq6h4" Apr 23 13:38:33.386536 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:38:33.386488 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/llmisvc-controller-manager-68cc5db7c4-bq6h4" podStartSLOduration=2.306989807 podStartE2EDuration="4.386475921s" 
podCreationTimestamp="2026-04-23 13:38:29 +0000 UTC" firstStartedPulling="2026-04-23 13:38:30.572463287 +0000 UTC m=+402.030922542" lastFinishedPulling="2026-04-23 13:38:32.651949401 +0000 UTC m=+404.110408656" observedRunningTime="2026-04-23 13:38:33.385838367 +0000 UTC m=+404.844297641" watchObservedRunningTime="2026-04-23 13:38:33.386475921 +0000 UTC m=+404.844935194" Apr 23 13:39:04.368992 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:39:04.368964 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-bq6h4" Apr 23 13:39:40.201033 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:39:40.200996 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/odh-model-controller-696fc77849-wsszv"] Apr 23 13:39:40.204061 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:39:40.204042 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/odh-model-controller-696fc77849-wsszv" Apr 23 13:39:40.207601 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:39:40.207568 2576 status_manager.go:895] "Failed to get status for pod" podUID="a91f57b6-1004-47fe-a0fd-c5630afa5894" pod="kserve/odh-model-controller-696fc77849-wsszv" err="pods \"odh-model-controller-696fc77849-wsszv\" is forbidden: User \"system:node:ip-10-0-134-22.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kserve\": no relationship found between node 'ip-10-0-134-22.ec2.internal' and this object" Apr 23 13:39:40.207601 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:39:40.207577 2576 reflector.go:200] "Failed to watch" err="failed to list *v1.Secret: secrets \"odh-model-controller-dockercfg-48gm8\" is forbidden: User \"system:node:ip-10-0-134-22.ec2.internal\" cannot list resource \"secrets\" in API group \"\" in the namespace \"kserve\": no relationship found between node 'ip-10-0-134-22.ec2.internal' and this object" logger="UnhandledError" 
reflector="object-\"kserve\"/\"odh-model-controller-dockercfg-48gm8\"" type="*v1.Secret" Apr 23 13:39:40.207737 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:39:40.207586 2576 reflector.go:200] "Failed to watch" err="failed to list *v1.Secret: secrets \"odh-model-controller-webhook-cert\" is forbidden: User \"system:node:ip-10-0-134-22.ec2.internal\" cannot list resource \"secrets\" in API group \"\" in the namespace \"kserve\": no relationship found between node 'ip-10-0-134-22.ec2.internal' and this object" logger="UnhandledError" reflector="object-\"kserve\"/\"odh-model-controller-webhook-cert\"" type="*v1.Secret" Apr 23 13:39:40.265563 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:39:40.265536 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-wsszv"] Apr 23 13:39:40.271321 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:39:40.271292 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wswvn\" (UniqueName: \"kubernetes.io/projected/a91f57b6-1004-47fe-a0fd-c5630afa5894-kube-api-access-wswvn\") pod \"odh-model-controller-696fc77849-wsszv\" (UID: \"a91f57b6-1004-47fe-a0fd-c5630afa5894\") " pod="kserve/odh-model-controller-696fc77849-wsszv" Apr 23 13:39:40.271482 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:39:40.271340 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a91f57b6-1004-47fe-a0fd-c5630afa5894-cert\") pod \"odh-model-controller-696fc77849-wsszv\" (UID: \"a91f57b6-1004-47fe-a0fd-c5630afa5894\") " pod="kserve/odh-model-controller-696fc77849-wsszv" Apr 23 13:39:40.372137 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:39:40.372096 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wswvn\" (UniqueName: \"kubernetes.io/projected/a91f57b6-1004-47fe-a0fd-c5630afa5894-kube-api-access-wswvn\") pod 
\"odh-model-controller-696fc77849-wsszv\" (UID: \"a91f57b6-1004-47fe-a0fd-c5630afa5894\") " pod="kserve/odh-model-controller-696fc77849-wsszv" Apr 23 13:39:40.372305 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:39:40.372199 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a91f57b6-1004-47fe-a0fd-c5630afa5894-cert\") pod \"odh-model-controller-696fc77849-wsszv\" (UID: \"a91f57b6-1004-47fe-a0fd-c5630afa5894\") " pod="kserve/odh-model-controller-696fc77849-wsszv" Apr 23 13:39:40.385892 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:39:40.385863 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wswvn\" (UniqueName: \"kubernetes.io/projected/a91f57b6-1004-47fe-a0fd-c5630afa5894-kube-api-access-wswvn\") pod \"odh-model-controller-696fc77849-wsszv\" (UID: \"a91f57b6-1004-47fe-a0fd-c5630afa5894\") " pod="kserve/odh-model-controller-696fc77849-wsszv" Apr 23 13:39:41.373249 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:39:41.373212 2576 secret.go:189] Couldn't get secret kserve/odh-model-controller-webhook-cert: failed to sync secret cache: timed out waiting for the condition Apr 23 13:39:41.373740 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:39:41.373318 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a91f57b6-1004-47fe-a0fd-c5630afa5894-cert podName:a91f57b6-1004-47fe-a0fd-c5630afa5894 nodeName:}" failed. No retries permitted until 2026-04-23 13:39:41.873294665 +0000 UTC m=+473.331753917 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a91f57b6-1004-47fe-a0fd-c5630afa5894-cert") pod "odh-model-controller-696fc77849-wsszv" (UID: "a91f57b6-1004-47fe-a0fd-c5630afa5894") : failed to sync secret cache: timed out waiting for the condition Apr 23 13:39:41.483592 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:39:41.483561 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-dockercfg-48gm8\"" Apr 23 13:39:41.800470 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:39:41.800438 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-webhook-cert\"" Apr 23 13:39:41.884290 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:39:41.884258 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a91f57b6-1004-47fe-a0fd-c5630afa5894-cert\") pod \"odh-model-controller-696fc77849-wsszv\" (UID: \"a91f57b6-1004-47fe-a0fd-c5630afa5894\") " pod="kserve/odh-model-controller-696fc77849-wsszv" Apr 23 13:39:41.886652 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:39:41.886633 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a91f57b6-1004-47fe-a0fd-c5630afa5894-cert\") pod \"odh-model-controller-696fc77849-wsszv\" (UID: \"a91f57b6-1004-47fe-a0fd-c5630afa5894\") " pod="kserve/odh-model-controller-696fc77849-wsszv" Apr 23 13:39:42.013971 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:39:42.013930 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/odh-model-controller-696fc77849-wsszv" Apr 23 13:39:42.154185 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:39:42.154150 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-wsszv"] Apr 23 13:39:42.157514 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:39:42.157486 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda91f57b6_1004_47fe_a0fd_c5630afa5894.slice/crio-127925b5d5fd436dde6fb0393b9f7f9a61a73c23bbb3866c0d4c4341798d52c8 WatchSource:0}: Error finding container 127925b5d5fd436dde6fb0393b9f7f9a61a73c23bbb3866c0d4c4341798d52c8: Status 404 returned error can't find the container with id 127925b5d5fd436dde6fb0393b9f7f9a61a73c23bbb3866c0d4c4341798d52c8 Apr 23 13:39:42.552928 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:39:42.552889 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-wsszv" event={"ID":"a91f57b6-1004-47fe-a0fd-c5630afa5894","Type":"ContainerStarted","Data":"127925b5d5fd436dde6fb0393b9f7f9a61a73c23bbb3866c0d4c4341798d52c8"} Apr 23 13:39:45.563370 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:39:45.563335 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-wsszv" event={"ID":"a91f57b6-1004-47fe-a0fd-c5630afa5894","Type":"ContainerStarted","Data":"2851139e0545f49533a3940480c91863e10df64debf7239950f7e1e119e54be2"} Apr 23 13:39:45.563769 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:39:45.563397 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/odh-model-controller-696fc77849-wsszv" Apr 23 13:39:45.590977 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:39:45.590924 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/odh-model-controller-696fc77849-wsszv" podStartSLOduration=2.949152684 podStartE2EDuration="5.590907402s" 
podCreationTimestamp="2026-04-23 13:39:40 +0000 UTC" firstStartedPulling="2026-04-23 13:39:42.158685437 +0000 UTC m=+473.617144688" lastFinishedPulling="2026-04-23 13:39:44.800440145 +0000 UTC m=+476.258899406" observedRunningTime="2026-04-23 13:39:45.589362403 +0000 UTC m=+477.047821678" watchObservedRunningTime="2026-04-23 13:39:45.590907402 +0000 UTC m=+477.049366676" Apr 23 13:39:48.885552 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:39:48.885514 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-777845d576-zrzmj"] Apr 23 13:39:48.889038 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:39:48.889018 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-777845d576-zrzmj" Apr 23 13:39:48.891459 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:39:48.891402 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 23 13:39:48.891595 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:39:48.891491 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 23 13:39:48.892069 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:39:48.892053 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-vhfkt\"" Apr 23 13:39:48.892225 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:39:48.892210 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 23 13:39:48.892297 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:39:48.892270 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 23 13:39:48.892452 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:39:48.892436 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" 
Apr 23 13:39:48.897558 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:39:48.897536 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 23 13:39:48.905182 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:39:48.905156 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-777845d576-zrzmj"] Apr 23 13:39:49.045385 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:39:49.045345 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8v68k\" (UniqueName: \"kubernetes.io/projected/fcb73fe1-16ee-4ac5-9fc0-4d13e4c91925-kube-api-access-8v68k\") pod \"console-777845d576-zrzmj\" (UID: \"fcb73fe1-16ee-4ac5-9fc0-4d13e4c91925\") " pod="openshift-console/console-777845d576-zrzmj" Apr 23 13:39:49.045589 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:39:49.045402 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fcb73fe1-16ee-4ac5-9fc0-4d13e4c91925-trusted-ca-bundle\") pod \"console-777845d576-zrzmj\" (UID: \"fcb73fe1-16ee-4ac5-9fc0-4d13e4c91925\") " pod="openshift-console/console-777845d576-zrzmj" Apr 23 13:39:49.045589 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:39:49.045453 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/fcb73fe1-16ee-4ac5-9fc0-4d13e4c91925-console-oauth-config\") pod \"console-777845d576-zrzmj\" (UID: \"fcb73fe1-16ee-4ac5-9fc0-4d13e4c91925\") " pod="openshift-console/console-777845d576-zrzmj" Apr 23 13:39:49.045589 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:39:49.045492 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/fcb73fe1-16ee-4ac5-9fc0-4d13e4c91925-console-config\") pod 
\"console-777845d576-zrzmj\" (UID: \"fcb73fe1-16ee-4ac5-9fc0-4d13e4c91925\") " pod="openshift-console/console-777845d576-zrzmj" Apr 23 13:39:49.045589 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:39:49.045533 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/fcb73fe1-16ee-4ac5-9fc0-4d13e4c91925-service-ca\") pod \"console-777845d576-zrzmj\" (UID: \"fcb73fe1-16ee-4ac5-9fc0-4d13e4c91925\") " pod="openshift-console/console-777845d576-zrzmj" Apr 23 13:39:49.045589 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:39:49.045555 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/fcb73fe1-16ee-4ac5-9fc0-4d13e4c91925-oauth-serving-cert\") pod \"console-777845d576-zrzmj\" (UID: \"fcb73fe1-16ee-4ac5-9fc0-4d13e4c91925\") " pod="openshift-console/console-777845d576-zrzmj" Apr 23 13:39:49.045759 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:39:49.045591 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/fcb73fe1-16ee-4ac5-9fc0-4d13e4c91925-console-serving-cert\") pod \"console-777845d576-zrzmj\" (UID: \"fcb73fe1-16ee-4ac5-9fc0-4d13e4c91925\") " pod="openshift-console/console-777845d576-zrzmj" Apr 23 13:39:49.146987 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:39:49.146903 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8v68k\" (UniqueName: \"kubernetes.io/projected/fcb73fe1-16ee-4ac5-9fc0-4d13e4c91925-kube-api-access-8v68k\") pod \"console-777845d576-zrzmj\" (UID: \"fcb73fe1-16ee-4ac5-9fc0-4d13e4c91925\") " pod="openshift-console/console-777845d576-zrzmj" Apr 23 13:39:49.146987 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:39:49.146951 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fcb73fe1-16ee-4ac5-9fc0-4d13e4c91925-trusted-ca-bundle\") pod \"console-777845d576-zrzmj\" (UID: \"fcb73fe1-16ee-4ac5-9fc0-4d13e4c91925\") " pod="openshift-console/console-777845d576-zrzmj" Apr 23 13:39:49.146987 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:39:49.146982 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/fcb73fe1-16ee-4ac5-9fc0-4d13e4c91925-console-oauth-config\") pod \"console-777845d576-zrzmj\" (UID: \"fcb73fe1-16ee-4ac5-9fc0-4d13e4c91925\") " pod="openshift-console/console-777845d576-zrzmj" Apr 23 13:39:49.147247 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:39:49.147025 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/fcb73fe1-16ee-4ac5-9fc0-4d13e4c91925-console-config\") pod \"console-777845d576-zrzmj\" (UID: \"fcb73fe1-16ee-4ac5-9fc0-4d13e4c91925\") " pod="openshift-console/console-777845d576-zrzmj" Apr 23 13:39:49.147247 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:39:49.147067 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/fcb73fe1-16ee-4ac5-9fc0-4d13e4c91925-service-ca\") pod \"console-777845d576-zrzmj\" (UID: \"fcb73fe1-16ee-4ac5-9fc0-4d13e4c91925\") " pod="openshift-console/console-777845d576-zrzmj" Apr 23 13:39:49.147247 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:39:49.147092 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/fcb73fe1-16ee-4ac5-9fc0-4d13e4c91925-oauth-serving-cert\") pod \"console-777845d576-zrzmj\" (UID: \"fcb73fe1-16ee-4ac5-9fc0-4d13e4c91925\") " pod="openshift-console/console-777845d576-zrzmj" Apr 23 13:39:49.147247 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:39:49.147130 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/fcb73fe1-16ee-4ac5-9fc0-4d13e4c91925-console-serving-cert\") pod \"console-777845d576-zrzmj\" (UID: \"fcb73fe1-16ee-4ac5-9fc0-4d13e4c91925\") " pod="openshift-console/console-777845d576-zrzmj" Apr 23 13:39:49.149352 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:39:49.149327 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 23 13:39:49.149498 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:39:49.149478 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 23 13:39:49.149744 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:39:49.149730 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 23 13:39:49.149958 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:39:49.149943 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 23 13:39:49.150030 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:39:49.149980 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 23 13:39:49.155404 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:39:49.155381 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 23 13:39:49.157812 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:39:49.157791 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/fcb73fe1-16ee-4ac5-9fc0-4d13e4c91925-console-config\") pod \"console-777845d576-zrzmj\" (UID: \"fcb73fe1-16ee-4ac5-9fc0-4d13e4c91925\") " pod="openshift-console/console-777845d576-zrzmj" Apr 23 13:39:49.157912 ip-10-0-134-22 kubenswrapper[2576]: 
I0423 13:39:49.157791 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/fcb73fe1-16ee-4ac5-9fc0-4d13e4c91925-service-ca\") pod \"console-777845d576-zrzmj\" (UID: \"fcb73fe1-16ee-4ac5-9fc0-4d13e4c91925\") " pod="openshift-console/console-777845d576-zrzmj" Apr 23 13:39:49.158134 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:39:49.158084 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/fcb73fe1-16ee-4ac5-9fc0-4d13e4c91925-oauth-serving-cert\") pod \"console-777845d576-zrzmj\" (UID: \"fcb73fe1-16ee-4ac5-9fc0-4d13e4c91925\") " pod="openshift-console/console-777845d576-zrzmj" Apr 23 13:39:49.158662 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:39:49.158639 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fcb73fe1-16ee-4ac5-9fc0-4d13e4c91925-trusted-ca-bundle\") pod \"console-777845d576-zrzmj\" (UID: \"fcb73fe1-16ee-4ac5-9fc0-4d13e4c91925\") " pod="openshift-console/console-777845d576-zrzmj" Apr 23 13:39:49.159990 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:39:49.159941 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8v68k\" (UniqueName: \"kubernetes.io/projected/fcb73fe1-16ee-4ac5-9fc0-4d13e4c91925-kube-api-access-8v68k\") pod \"console-777845d576-zrzmj\" (UID: \"fcb73fe1-16ee-4ac5-9fc0-4d13e4c91925\") " pod="openshift-console/console-777845d576-zrzmj" Apr 23 13:39:49.160079 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:39:49.160004 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/fcb73fe1-16ee-4ac5-9fc0-4d13e4c91925-console-serving-cert\") pod \"console-777845d576-zrzmj\" (UID: \"fcb73fe1-16ee-4ac5-9fc0-4d13e4c91925\") " pod="openshift-console/console-777845d576-zrzmj" Apr 23 13:39:49.160534 
ip-10-0-134-22 kubenswrapper[2576]: I0423 13:39:49.160514 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/fcb73fe1-16ee-4ac5-9fc0-4d13e4c91925-console-oauth-config\") pod \"console-777845d576-zrzmj\" (UID: \"fcb73fe1-16ee-4ac5-9fc0-4d13e4c91925\") " pod="openshift-console/console-777845d576-zrzmj" Apr 23 13:39:49.201873 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:39:49.201848 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-vhfkt\"" Apr 23 13:39:49.209651 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:39:49.209630 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-777845d576-zrzmj" Apr 23 13:39:49.341552 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:39:49.341525 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-777845d576-zrzmj"] Apr 23 13:39:49.343150 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:39:49.343118 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfcb73fe1_16ee_4ac5_9fc0_4d13e4c91925.slice/crio-cf42327c57c7b24e21931b5b870059934e6f817924cf7bf02ce40e58a5b3d49b WatchSource:0}: Error finding container cf42327c57c7b24e21931b5b870059934e6f817924cf7bf02ce40e58a5b3d49b: Status 404 returned error can't find the container with id cf42327c57c7b24e21931b5b870059934e6f817924cf7bf02ce40e58a5b3d49b Apr 23 13:39:49.578854 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:39:49.578755 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-777845d576-zrzmj" event={"ID":"fcb73fe1-16ee-4ac5-9fc0-4d13e4c91925","Type":"ContainerStarted","Data":"0fc46f9a24c1174191947d30e6245bef27075d3276b2760ee6d3e49c5c8f819c"} Apr 23 13:39:49.578854 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:39:49.578797 2576 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-console/console-777845d576-zrzmj" event={"ID":"fcb73fe1-16ee-4ac5-9fc0-4d13e4c91925","Type":"ContainerStarted","Data":"cf42327c57c7b24e21931b5b870059934e6f817924cf7bf02ce40e58a5b3d49b"} Apr 23 13:39:49.598888 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:39:49.598833 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-777845d576-zrzmj" podStartSLOduration=1.598818445 podStartE2EDuration="1.598818445s" podCreationTimestamp="2026-04-23 13:39:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 13:39:49.59736304 +0000 UTC m=+481.055822326" watchObservedRunningTime="2026-04-23 13:39:49.598818445 +0000 UTC m=+481.057277719" Apr 23 13:39:56.631023 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:39:56.630987 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/odh-model-controller-696fc77849-wsszv" Apr 23 13:39:59.209807 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:39:59.209776 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-777845d576-zrzmj" Apr 23 13:39:59.209807 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:39:59.209814 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-777845d576-zrzmj" Apr 23 13:39:59.215075 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:39:59.215050 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-777845d576-zrzmj" Apr 23 13:39:59.612614 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:39:59.612524 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-777845d576-zrzmj" Apr 23 13:41:49.097241 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:41:49.097212 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-72hmc_edb014da-0558-4a2a-9f98-bea52a2c723e/ovn-acl-logging/0.log" Apr 23 13:41:49.098338 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:41:49.098322 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-72hmc_edb014da-0558-4a2a-9f98-bea52a2c723e/ovn-acl-logging/0.log" Apr 23 13:43:43.569780 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:43:43.569744 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-b90a6-98bfc68fb-jbrkv"] Apr 23 13:43:43.572860 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:43:43.572843 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-b90a6-98bfc68fb-jbrkv" Apr 23 13:43:43.574923 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:43:43.574899 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-raw-b90a6-serving-cert\"" Apr 23 13:43:43.575055 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:43:43.574992 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 23 13:43:43.575565 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:43:43.575550 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-raw-b90a6-kube-rbac-proxy-sar-config\"" Apr 23 13:43:43.575622 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:43:43.575586 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-blqr2\"" Apr 23 13:43:43.580882 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:43:43.580857 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-b90a6-98bfc68fb-jbrkv"] Apr 23 13:43:43.596948 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:43:43.596920 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f7b430e5-15a1-4362-80da-08e5986dcdba-proxy-tls\") pod \"model-chainer-raw-b90a6-98bfc68fb-jbrkv\" (UID: \"f7b430e5-15a1-4362-80da-08e5986dcdba\") " pod="kserve-ci-e2e-test/model-chainer-raw-b90a6-98bfc68fb-jbrkv" Apr 23 13:43:43.597074 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:43:43.596974 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f7b430e5-15a1-4362-80da-08e5986dcdba-openshift-service-ca-bundle\") pod \"model-chainer-raw-b90a6-98bfc68fb-jbrkv\" (UID: \"f7b430e5-15a1-4362-80da-08e5986dcdba\") " pod="kserve-ci-e2e-test/model-chainer-raw-b90a6-98bfc68fb-jbrkv" Apr 23 13:43:43.697488 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:43:43.697457 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f7b430e5-15a1-4362-80da-08e5986dcdba-openshift-service-ca-bundle\") pod \"model-chainer-raw-b90a6-98bfc68fb-jbrkv\" (UID: \"f7b430e5-15a1-4362-80da-08e5986dcdba\") " pod="kserve-ci-e2e-test/model-chainer-raw-b90a6-98bfc68fb-jbrkv" Apr 23 13:43:43.697649 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:43:43.697549 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f7b430e5-15a1-4362-80da-08e5986dcdba-proxy-tls\") pod \"model-chainer-raw-b90a6-98bfc68fb-jbrkv\" (UID: \"f7b430e5-15a1-4362-80da-08e5986dcdba\") " pod="kserve-ci-e2e-test/model-chainer-raw-b90a6-98bfc68fb-jbrkv" Apr 23 13:43:43.698104 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:43:43.698070 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f7b430e5-15a1-4362-80da-08e5986dcdba-openshift-service-ca-bundle\") 
pod \"model-chainer-raw-b90a6-98bfc68fb-jbrkv\" (UID: \"f7b430e5-15a1-4362-80da-08e5986dcdba\") " pod="kserve-ci-e2e-test/model-chainer-raw-b90a6-98bfc68fb-jbrkv" Apr 23 13:43:43.699930 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:43:43.699909 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f7b430e5-15a1-4362-80da-08e5986dcdba-proxy-tls\") pod \"model-chainer-raw-b90a6-98bfc68fb-jbrkv\" (UID: \"f7b430e5-15a1-4362-80da-08e5986dcdba\") " pod="kserve-ci-e2e-test/model-chainer-raw-b90a6-98bfc68fb-jbrkv" Apr 23 13:43:43.883486 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:43:43.883461 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-b90a6-98bfc68fb-jbrkv" Apr 23 13:43:44.002308 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:43:44.002238 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-b90a6-98bfc68fb-jbrkv"] Apr 23 13:43:44.007493 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:43:44.007449 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf7b430e5_15a1_4362_80da_08e5986dcdba.slice/crio-2fd1a5aadd6f9432fedf1b16b92593c9e9212630d8c7fc84b5514433d8002532 WatchSource:0}: Error finding container 2fd1a5aadd6f9432fedf1b16b92593c9e9212630d8c7fc84b5514433d8002532: Status 404 returned error can't find the container with id 2fd1a5aadd6f9432fedf1b16b92593c9e9212630d8c7fc84b5514433d8002532 Apr 23 13:43:44.008989 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:43:44.008972 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 13:43:44.297259 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:43:44.297172 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-b90a6-98bfc68fb-jbrkv" 
event={"ID":"f7b430e5-15a1-4362-80da-08e5986dcdba","Type":"ContainerStarted","Data":"2fd1a5aadd6f9432fedf1b16b92593c9e9212630d8c7fc84b5514433d8002532"} Apr 23 13:43:47.307286 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:43:47.307250 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-b90a6-98bfc68fb-jbrkv" event={"ID":"f7b430e5-15a1-4362-80da-08e5986dcdba","Type":"ContainerStarted","Data":"b5da76b20a12c20fe459f29310755bcfe1017b8c58d6f1e7c07c83101e041d71"} Apr 23 13:43:47.307753 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:43:47.307374 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/model-chainer-raw-b90a6-98bfc68fb-jbrkv" Apr 23 13:43:47.322761 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:43:47.322446 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/model-chainer-raw-b90a6-98bfc68fb-jbrkv" podStartSLOduration=2.083940019 podStartE2EDuration="4.322431508s" podCreationTimestamp="2026-04-23 13:43:43 +0000 UTC" firstStartedPulling="2026-04-23 13:43:44.009106911 +0000 UTC m=+715.467566180" lastFinishedPulling="2026-04-23 13:43:46.247598399 +0000 UTC m=+717.706057669" observedRunningTime="2026-04-23 13:43:47.321862715 +0000 UTC m=+718.780322012" watchObservedRunningTime="2026-04-23 13:43:47.322431508 +0000 UTC m=+718.780890783" Apr 23 13:43:53.316024 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:43:53.315993 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/model-chainer-raw-b90a6-98bfc68fb-jbrkv" Apr 23 13:43:53.627300 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:43:53.627225 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-b90a6-98bfc68fb-jbrkv"] Apr 23 13:43:53.627470 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:43:53.627442 2576 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kserve-ci-e2e-test/model-chainer-raw-b90a6-98bfc68fb-jbrkv" podUID="f7b430e5-15a1-4362-80da-08e5986dcdba" containerName="model-chainer-raw-b90a6" containerID="cri-o://b5da76b20a12c20fe459f29310755bcfe1017b8c58d6f1e7c07c83101e041d71" gracePeriod=30 Apr 23 13:43:58.314652 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:43:58.314610 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-b90a6-98bfc68fb-jbrkv" podUID="f7b430e5-15a1-4362-80da-08e5986dcdba" containerName="model-chainer-raw-b90a6" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 13:44:03.314296 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:44:03.314253 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-b90a6-98bfc68fb-jbrkv" podUID="f7b430e5-15a1-4362-80da-08e5986dcdba" containerName="model-chainer-raw-b90a6" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 13:44:08.314088 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:44:08.314053 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-b90a6-98bfc68fb-jbrkv" podUID="f7b430e5-15a1-4362-80da-08e5986dcdba" containerName="model-chainer-raw-b90a6" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 13:44:08.314579 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:44:08.314145 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/model-chainer-raw-b90a6-98bfc68fb-jbrkv" Apr 23 13:44:13.314017 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:44:13.313973 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-b90a6-98bfc68fb-jbrkv" podUID="f7b430e5-15a1-4362-80da-08e5986dcdba" containerName="model-chainer-raw-b90a6" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 13:44:18.314170 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:44:18.314121 2576 
prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-b90a6-98bfc68fb-jbrkv" podUID="f7b430e5-15a1-4362-80da-08e5986dcdba" containerName="model-chainer-raw-b90a6" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 13:44:23.314574 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:44:23.314535 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-b90a6-98bfc68fb-jbrkv" podUID="f7b430e5-15a1-4362-80da-08e5986dcdba" containerName="model-chainer-raw-b90a6" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 13:44:23.765171 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:44:23.765148 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-b90a6-98bfc68fb-jbrkv" Apr 23 13:44:23.922043 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:44:23.922010 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f7b430e5-15a1-4362-80da-08e5986dcdba-proxy-tls\") pod \"f7b430e5-15a1-4362-80da-08e5986dcdba\" (UID: \"f7b430e5-15a1-4362-80da-08e5986dcdba\") " Apr 23 13:44:23.922224 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:44:23.922074 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f7b430e5-15a1-4362-80da-08e5986dcdba-openshift-service-ca-bundle\") pod \"f7b430e5-15a1-4362-80da-08e5986dcdba\" (UID: \"f7b430e5-15a1-4362-80da-08e5986dcdba\") " Apr 23 13:44:23.922472 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:44:23.922449 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f7b430e5-15a1-4362-80da-08e5986dcdba-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "f7b430e5-15a1-4362-80da-08e5986dcdba" (UID: 
"f7b430e5-15a1-4362-80da-08e5986dcdba"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 13:44:23.924164 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:44:23.924144 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7b430e5-15a1-4362-80da-08e5986dcdba-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "f7b430e5-15a1-4362-80da-08e5986dcdba" (UID: "f7b430e5-15a1-4362-80da-08e5986dcdba"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 13:44:24.022957 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:44:24.022907 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f7b430e5-15a1-4362-80da-08e5986dcdba-proxy-tls\") on node \"ip-10-0-134-22.ec2.internal\" DevicePath \"\"" Apr 23 13:44:24.022957 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:44:24.022951 2576 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f7b430e5-15a1-4362-80da-08e5986dcdba-openshift-service-ca-bundle\") on node \"ip-10-0-134-22.ec2.internal\" DevicePath \"\"" Apr 23 13:44:24.417841 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:44:24.417809 2576 generic.go:358] "Generic (PLEG): container finished" podID="f7b430e5-15a1-4362-80da-08e5986dcdba" containerID="b5da76b20a12c20fe459f29310755bcfe1017b8c58d6f1e7c07c83101e041d71" exitCode=0 Apr 23 13:44:24.418287 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:44:24.417882 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-b90a6-98bfc68fb-jbrkv" Apr 23 13:44:24.418287 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:44:24.417906 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-b90a6-98bfc68fb-jbrkv" event={"ID":"f7b430e5-15a1-4362-80da-08e5986dcdba","Type":"ContainerDied","Data":"b5da76b20a12c20fe459f29310755bcfe1017b8c58d6f1e7c07c83101e041d71"} Apr 23 13:44:24.418287 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:44:24.417946 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-b90a6-98bfc68fb-jbrkv" event={"ID":"f7b430e5-15a1-4362-80da-08e5986dcdba","Type":"ContainerDied","Data":"2fd1a5aadd6f9432fedf1b16b92593c9e9212630d8c7fc84b5514433d8002532"} Apr 23 13:44:24.418287 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:44:24.417962 2576 scope.go:117] "RemoveContainer" containerID="b5da76b20a12c20fe459f29310755bcfe1017b8c58d6f1e7c07c83101e041d71" Apr 23 13:44:24.427209 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:44:24.427191 2576 scope.go:117] "RemoveContainer" containerID="b5da76b20a12c20fe459f29310755bcfe1017b8c58d6f1e7c07c83101e041d71" Apr 23 13:44:24.427475 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:44:24.427452 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b5da76b20a12c20fe459f29310755bcfe1017b8c58d6f1e7c07c83101e041d71\": container with ID starting with b5da76b20a12c20fe459f29310755bcfe1017b8c58d6f1e7c07c83101e041d71 not found: ID does not exist" containerID="b5da76b20a12c20fe459f29310755bcfe1017b8c58d6f1e7c07c83101e041d71" Apr 23 13:44:24.427559 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:44:24.427487 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5da76b20a12c20fe459f29310755bcfe1017b8c58d6f1e7c07c83101e041d71"} err="failed to get container status 
\"b5da76b20a12c20fe459f29310755bcfe1017b8c58d6f1e7c07c83101e041d71\": rpc error: code = NotFound desc = could not find container \"b5da76b20a12c20fe459f29310755bcfe1017b8c58d6f1e7c07c83101e041d71\": container with ID starting with b5da76b20a12c20fe459f29310755bcfe1017b8c58d6f1e7c07c83101e041d71 not found: ID does not exist" Apr 23 13:44:24.440993 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:44:24.440960 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-b90a6-98bfc68fb-jbrkv"] Apr 23 13:44:24.444596 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:44:24.444572 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-b90a6-98bfc68fb-jbrkv"] Apr 23 13:44:25.160557 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:44:25.160524 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7b430e5-15a1-4362-80da-08e5986dcdba" path="/var/lib/kubelet/pods/f7b430e5-15a1-4362-80da-08e5986dcdba/volumes" Apr 23 13:45:23.931806 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:45:23.931775 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-4f8db-7f97674dc6-bphnl"] Apr 23 13:45:23.932250 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:45:23.932122 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f7b430e5-15a1-4362-80da-08e5986dcdba" containerName="model-chainer-raw-b90a6" Apr 23 13:45:23.932250 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:45:23.932134 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7b430e5-15a1-4362-80da-08e5986dcdba" containerName="model-chainer-raw-b90a6" Apr 23 13:45:23.932250 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:45:23.932195 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="f7b430e5-15a1-4362-80da-08e5986dcdba" containerName="model-chainer-raw-b90a6" Apr 23 13:45:23.935169 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:45:23.935151 2576 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-4f8db-7f97674dc6-bphnl" Apr 23 13:45:23.937521 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:45:23.937499 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-raw-hpa-4f8db-kube-rbac-proxy-sar-config\"" Apr 23 13:45:23.937521 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:45:23.937508 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 23 13:45:23.937782 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:45:23.937766 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-blqr2\"" Apr 23 13:45:23.938245 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:45:23.938227 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-raw-hpa-4f8db-serving-cert\"" Apr 23 13:45:23.946477 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:45:23.946456 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-4f8db-7f97674dc6-bphnl"] Apr 23 13:45:24.008799 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:45:24.008766 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cdffad98-406e-4b80-ab90-90779209ae8c-openshift-service-ca-bundle\") pod \"model-chainer-raw-hpa-4f8db-7f97674dc6-bphnl\" (UID: \"cdffad98-406e-4b80-ab90-90779209ae8c\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-4f8db-7f97674dc6-bphnl" Apr 23 13:45:24.008947 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:45:24.008821 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/cdffad98-406e-4b80-ab90-90779209ae8c-proxy-tls\") pod \"model-chainer-raw-hpa-4f8db-7f97674dc6-bphnl\" (UID: \"cdffad98-406e-4b80-ab90-90779209ae8c\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-4f8db-7f97674dc6-bphnl" Apr 23 13:45:24.109810 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:45:24.109781 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cdffad98-406e-4b80-ab90-90779209ae8c-openshift-service-ca-bundle\") pod \"model-chainer-raw-hpa-4f8db-7f97674dc6-bphnl\" (UID: \"cdffad98-406e-4b80-ab90-90779209ae8c\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-4f8db-7f97674dc6-bphnl" Apr 23 13:45:24.109936 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:45:24.109839 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cdffad98-406e-4b80-ab90-90779209ae8c-proxy-tls\") pod \"model-chainer-raw-hpa-4f8db-7f97674dc6-bphnl\" (UID: \"cdffad98-406e-4b80-ab90-90779209ae8c\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-4f8db-7f97674dc6-bphnl" Apr 23 13:45:24.109980 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:45:24.109959 2576 secret.go:189] Couldn't get secret kserve-ci-e2e-test/model-chainer-raw-hpa-4f8db-serving-cert: secret "model-chainer-raw-hpa-4f8db-serving-cert" not found Apr 23 13:45:24.110049 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:45:24.110038 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cdffad98-406e-4b80-ab90-90779209ae8c-proxy-tls podName:cdffad98-406e-4b80-ab90-90779209ae8c nodeName:}" failed. No retries permitted until 2026-04-23 13:45:24.610015025 +0000 UTC m=+816.068474277 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/cdffad98-406e-4b80-ab90-90779209ae8c-proxy-tls") pod "model-chainer-raw-hpa-4f8db-7f97674dc6-bphnl" (UID: "cdffad98-406e-4b80-ab90-90779209ae8c") : secret "model-chainer-raw-hpa-4f8db-serving-cert" not found Apr 23 13:45:24.110585 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:45:24.110566 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cdffad98-406e-4b80-ab90-90779209ae8c-openshift-service-ca-bundle\") pod \"model-chainer-raw-hpa-4f8db-7f97674dc6-bphnl\" (UID: \"cdffad98-406e-4b80-ab90-90779209ae8c\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-4f8db-7f97674dc6-bphnl" Apr 23 13:45:24.615145 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:45:24.615110 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cdffad98-406e-4b80-ab90-90779209ae8c-proxy-tls\") pod \"model-chainer-raw-hpa-4f8db-7f97674dc6-bphnl\" (UID: \"cdffad98-406e-4b80-ab90-90779209ae8c\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-4f8db-7f97674dc6-bphnl" Apr 23 13:45:24.617462 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:45:24.617437 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cdffad98-406e-4b80-ab90-90779209ae8c-proxy-tls\") pod \"model-chainer-raw-hpa-4f8db-7f97674dc6-bphnl\" (UID: \"cdffad98-406e-4b80-ab90-90779209ae8c\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-4f8db-7f97674dc6-bphnl" Apr 23 13:45:24.845907 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:45:24.845868 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-4f8db-7f97674dc6-bphnl" Apr 23 13:45:24.967957 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:45:24.967929 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-4f8db-7f97674dc6-bphnl"] Apr 23 13:45:24.969690 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:45:24.969657 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcdffad98_406e_4b80_ab90_90779209ae8c.slice/crio-f90ca9ece6bacb1c1dc988cabda544521113058b8ad2bb242d2b70bf075007fd WatchSource:0}: Error finding container f90ca9ece6bacb1c1dc988cabda544521113058b8ad2bb242d2b70bf075007fd: Status 404 returned error can't find the container with id f90ca9ece6bacb1c1dc988cabda544521113058b8ad2bb242d2b70bf075007fd Apr 23 13:45:25.617834 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:45:25.617800 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-4f8db-7f97674dc6-bphnl" event={"ID":"cdffad98-406e-4b80-ab90-90779209ae8c","Type":"ContainerStarted","Data":"770d0d6b956e822e8582ef15a30e88ea52bfa35ddcb9b3829d5a13952454851c"} Apr 23 13:45:25.617834 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:45:25.617839 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-4f8db-7f97674dc6-bphnl" event={"ID":"cdffad98-406e-4b80-ab90-90779209ae8c","Type":"ContainerStarted","Data":"f90ca9ece6bacb1c1dc988cabda544521113058b8ad2bb242d2b70bf075007fd"} Apr 23 13:45:25.618035 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:45:25.617918 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-4f8db-7f97674dc6-bphnl" Apr 23 13:45:25.638733 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:45:25.638685 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve-ci-e2e-test/model-chainer-raw-hpa-4f8db-7f97674dc6-bphnl" podStartSLOduration=2.638671188 podStartE2EDuration="2.638671188s" podCreationTimestamp="2026-04-23 13:45:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 13:45:25.636901537 +0000 UTC m=+817.095360811" watchObservedRunningTime="2026-04-23 13:45:25.638671188 +0000 UTC m=+817.097130503" Apr 23 13:45:31.626260 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:45:31.626228 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-4f8db-7f97674dc6-bphnl" Apr 23 13:45:34.047282 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:45:34.047249 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-4f8db-7f97674dc6-bphnl"] Apr 23 13:45:34.047670 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:45:34.047500 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-4f8db-7f97674dc6-bphnl" podUID="cdffad98-406e-4b80-ab90-90779209ae8c" containerName="model-chainer-raw-hpa-4f8db" containerID="cri-o://770d0d6b956e822e8582ef15a30e88ea52bfa35ddcb9b3829d5a13952454851c" gracePeriod=30 Apr 23 13:45:36.625942 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:45:36.625893 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-4f8db-7f97674dc6-bphnl" podUID="cdffad98-406e-4b80-ab90-90779209ae8c" containerName="model-chainer-raw-hpa-4f8db" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 13:45:41.625146 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:45:41.625109 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-4f8db-7f97674dc6-bphnl" podUID="cdffad98-406e-4b80-ab90-90779209ae8c" containerName="model-chainer-raw-hpa-4f8db" probeResult="failure" 
output="HTTP probe failed with statuscode: 503" Apr 23 13:45:46.624967 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:45:46.624915 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-4f8db-7f97674dc6-bphnl" podUID="cdffad98-406e-4b80-ab90-90779209ae8c" containerName="model-chainer-raw-hpa-4f8db" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 13:45:46.625454 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:45:46.625033 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-4f8db-7f97674dc6-bphnl" Apr 23 13:45:51.624715 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:45:51.624673 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-4f8db-7f97674dc6-bphnl" podUID="cdffad98-406e-4b80-ab90-90779209ae8c" containerName="model-chainer-raw-hpa-4f8db" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 13:45:56.625859 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:45:56.625810 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-4f8db-7f97674dc6-bphnl" podUID="cdffad98-406e-4b80-ab90-90779209ae8c" containerName="model-chainer-raw-hpa-4f8db" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 13:46:01.625494 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:46:01.625456 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-4f8db-7f97674dc6-bphnl" podUID="cdffad98-406e-4b80-ab90-90779209ae8c" containerName="model-chainer-raw-hpa-4f8db" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 13:46:04.072709 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:46:04.072677 2576 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcdffad98_406e_4b80_ab90_90779209ae8c.slice/crio-770d0d6b956e822e8582ef15a30e88ea52bfa35ddcb9b3829d5a13952454851c.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcdffad98_406e_4b80_ab90_90779209ae8c.slice/crio-conmon-770d0d6b956e822e8582ef15a30e88ea52bfa35ddcb9b3829d5a13952454851c.scope\": RecentStats: unable to find data in memory cache]" Apr 23 13:46:04.073021 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:46:04.072964 2576 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcdffad98_406e_4b80_ab90_90779209ae8c.slice/crio-conmon-770d0d6b956e822e8582ef15a30e88ea52bfa35ddcb9b3829d5a13952454851c.scope\": RecentStats: unable to find data in memory cache]" Apr 23 13:46:04.204925 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:46:04.204898 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-4f8db-7f97674dc6-bphnl" Apr 23 13:46:04.319625 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:46:04.319542 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cdffad98-406e-4b80-ab90-90779209ae8c-proxy-tls\") pod \"cdffad98-406e-4b80-ab90-90779209ae8c\" (UID: \"cdffad98-406e-4b80-ab90-90779209ae8c\") " Apr 23 13:46:04.319625 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:46:04.319621 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cdffad98-406e-4b80-ab90-90779209ae8c-openshift-service-ca-bundle\") pod \"cdffad98-406e-4b80-ab90-90779209ae8c\" (UID: \"cdffad98-406e-4b80-ab90-90779209ae8c\") " Apr 23 13:46:04.319996 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:46:04.319965 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cdffad98-406e-4b80-ab90-90779209ae8c-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "cdffad98-406e-4b80-ab90-90779209ae8c" (UID: "cdffad98-406e-4b80-ab90-90779209ae8c"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 13:46:04.321275 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:46:04.321251 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdffad98-406e-4b80-ab90-90779209ae8c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "cdffad98-406e-4b80-ab90-90779209ae8c" (UID: "cdffad98-406e-4b80-ab90-90779209ae8c"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 13:46:04.420582 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:46:04.420550 2576 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cdffad98-406e-4b80-ab90-90779209ae8c-openshift-service-ca-bundle\") on node \"ip-10-0-134-22.ec2.internal\" DevicePath \"\"" Apr 23 13:46:04.420582 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:46:04.420575 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cdffad98-406e-4b80-ab90-90779209ae8c-proxy-tls\") on node \"ip-10-0-134-22.ec2.internal\" DevicePath \"\"" Apr 23 13:46:04.739196 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:46:04.739156 2576 generic.go:358] "Generic (PLEG): container finished" podID="cdffad98-406e-4b80-ab90-90779209ae8c" containerID="770d0d6b956e822e8582ef15a30e88ea52bfa35ddcb9b3829d5a13952454851c" exitCode=0 Apr 23 13:46:04.739354 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:46:04.739214 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-4f8db-7f97674dc6-bphnl" event={"ID":"cdffad98-406e-4b80-ab90-90779209ae8c","Type":"ContainerDied","Data":"770d0d6b956e822e8582ef15a30e88ea52bfa35ddcb9b3829d5a13952454851c"} Apr 23 13:46:04.739354 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:46:04.739247 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-4f8db-7f97674dc6-bphnl" event={"ID":"cdffad98-406e-4b80-ab90-90779209ae8c","Type":"ContainerDied","Data":"f90ca9ece6bacb1c1dc988cabda544521113058b8ad2bb242d2b70bf075007fd"} Apr 23 13:46:04.739354 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:46:04.739263 2576 scope.go:117] "RemoveContainer" containerID="770d0d6b956e822e8582ef15a30e88ea52bfa35ddcb9b3829d5a13952454851c" Apr 23 13:46:04.739354 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:46:04.739223 2576 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-4f8db-7f97674dc6-bphnl" Apr 23 13:46:04.747929 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:46:04.747798 2576 scope.go:117] "RemoveContainer" containerID="770d0d6b956e822e8582ef15a30e88ea52bfa35ddcb9b3829d5a13952454851c" Apr 23 13:46:04.748166 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:46:04.748136 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"770d0d6b956e822e8582ef15a30e88ea52bfa35ddcb9b3829d5a13952454851c\": container with ID starting with 770d0d6b956e822e8582ef15a30e88ea52bfa35ddcb9b3829d5a13952454851c not found: ID does not exist" containerID="770d0d6b956e822e8582ef15a30e88ea52bfa35ddcb9b3829d5a13952454851c" Apr 23 13:46:04.748265 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:46:04.748175 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"770d0d6b956e822e8582ef15a30e88ea52bfa35ddcb9b3829d5a13952454851c"} err="failed to get container status \"770d0d6b956e822e8582ef15a30e88ea52bfa35ddcb9b3829d5a13952454851c\": rpc error: code = NotFound desc = could not find container \"770d0d6b956e822e8582ef15a30e88ea52bfa35ddcb9b3829d5a13952454851c\": container with ID starting with 770d0d6b956e822e8582ef15a30e88ea52bfa35ddcb9b3829d5a13952454851c not found: ID does not exist" Apr 23 13:46:04.767836 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:46:04.767810 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-4f8db-7f97674dc6-bphnl"] Apr 23 13:46:04.781320 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:46:04.781294 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-4f8db-7f97674dc6-bphnl"] Apr 23 13:46:05.159813 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:46:05.159785 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="cdffad98-406e-4b80-ab90-90779209ae8c" path="/var/lib/kubelet/pods/cdffad98-406e-4b80-ab90-90779209ae8c/volumes" Apr 23 13:46:49.117860 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:46:49.117825 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-72hmc_edb014da-0558-4a2a-9f98-bea52a2c723e/ovn-acl-logging/0.log" Apr 23 13:46:49.119600 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:46:49.119579 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-72hmc_edb014da-0558-4a2a-9f98-bea52a2c723e/ovn-acl-logging/0.log" Apr 23 13:51:49.137987 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:51:49.137958 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-72hmc_edb014da-0558-4a2a-9f98-bea52a2c723e/ovn-acl-logging/0.log" Apr 23 13:51:49.140774 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:51:49.140751 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-72hmc_edb014da-0558-4a2a-9f98-bea52a2c723e/ovn-acl-logging/0.log" Apr 23 13:54:18.015554 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:54:18.015517 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-6ff9q/must-gather-h9g2f"] Apr 23 13:54:18.016044 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:54:18.015825 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cdffad98-406e-4b80-ab90-90779209ae8c" containerName="model-chainer-raw-hpa-4f8db" Apr 23 13:54:18.016044 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:54:18.015837 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdffad98-406e-4b80-ab90-90779209ae8c" containerName="model-chainer-raw-hpa-4f8db" Apr 23 13:54:18.016044 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:54:18.015916 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="cdffad98-406e-4b80-ab90-90779209ae8c" 
containerName="model-chainer-raw-hpa-4f8db" Apr 23 13:54:18.018790 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:54:18.018772 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6ff9q/must-gather-h9g2f" Apr 23 13:54:18.021121 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:54:18.021094 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-6ff9q\"/\"openshift-service-ca.crt\"" Apr 23 13:54:18.021241 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:54:18.021133 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-6ff9q\"/\"kube-root-ca.crt\"" Apr 23 13:54:18.021241 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:54:18.021224 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-6ff9q\"/\"default-dockercfg-vkn8w\"" Apr 23 13:54:18.026467 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:54:18.026444 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-6ff9q/must-gather-h9g2f"] Apr 23 13:54:18.089621 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:54:18.089589 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/db1c7585-fda9-40e6-8559-e9dd2e4a6789-must-gather-output\") pod \"must-gather-h9g2f\" (UID: \"db1c7585-fda9-40e6-8559-e9dd2e4a6789\") " pod="openshift-must-gather-6ff9q/must-gather-h9g2f" Apr 23 13:54:18.089801 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:54:18.089642 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mq6xj\" (UniqueName: \"kubernetes.io/projected/db1c7585-fda9-40e6-8559-e9dd2e4a6789-kube-api-access-mq6xj\") pod \"must-gather-h9g2f\" (UID: \"db1c7585-fda9-40e6-8559-e9dd2e4a6789\") " pod="openshift-must-gather-6ff9q/must-gather-h9g2f" Apr 23 13:54:18.190597 
ip-10-0-134-22 kubenswrapper[2576]: I0423 13:54:18.190546 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/db1c7585-fda9-40e6-8559-e9dd2e4a6789-must-gather-output\") pod \"must-gather-h9g2f\" (UID: \"db1c7585-fda9-40e6-8559-e9dd2e4a6789\") " pod="openshift-must-gather-6ff9q/must-gather-h9g2f" Apr 23 13:54:18.190772 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:54:18.190617 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mq6xj\" (UniqueName: \"kubernetes.io/projected/db1c7585-fda9-40e6-8559-e9dd2e4a6789-kube-api-access-mq6xj\") pod \"must-gather-h9g2f\" (UID: \"db1c7585-fda9-40e6-8559-e9dd2e4a6789\") " pod="openshift-must-gather-6ff9q/must-gather-h9g2f" Apr 23 13:54:18.190891 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:54:18.190872 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/db1c7585-fda9-40e6-8559-e9dd2e4a6789-must-gather-output\") pod \"must-gather-h9g2f\" (UID: \"db1c7585-fda9-40e6-8559-e9dd2e4a6789\") " pod="openshift-must-gather-6ff9q/must-gather-h9g2f" Apr 23 13:54:18.198210 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:54:18.198180 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mq6xj\" (UniqueName: \"kubernetes.io/projected/db1c7585-fda9-40e6-8559-e9dd2e4a6789-kube-api-access-mq6xj\") pod \"must-gather-h9g2f\" (UID: \"db1c7585-fda9-40e6-8559-e9dd2e4a6789\") " pod="openshift-must-gather-6ff9q/must-gather-h9g2f" Apr 23 13:54:18.328292 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:54:18.328197 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-6ff9q/must-gather-h9g2f" Apr 23 13:54:18.448901 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:54:18.448876 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-6ff9q/must-gather-h9g2f"] Apr 23 13:54:18.450584 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:54:18.450554 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddb1c7585_fda9_40e6_8559_e9dd2e4a6789.slice/crio-6cde1d94d277913d42cb7bae81853b477e975c6b1d7ca01365ec0699c1bd092d WatchSource:0}: Error finding container 6cde1d94d277913d42cb7bae81853b477e975c6b1d7ca01365ec0699c1bd092d: Status 404 returned error can't find the container with id 6cde1d94d277913d42cb7bae81853b477e975c6b1d7ca01365ec0699c1bd092d Apr 23 13:54:18.452637 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:54:18.452620 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 13:54:19.296965 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:54:19.296929 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6ff9q/must-gather-h9g2f" event={"ID":"db1c7585-fda9-40e6-8559-e9dd2e4a6789","Type":"ContainerStarted","Data":"6cde1d94d277913d42cb7bae81853b477e975c6b1d7ca01365ec0699c1bd092d"} Apr 23 13:54:23.310701 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:54:23.310668 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6ff9q/must-gather-h9g2f" event={"ID":"db1c7585-fda9-40e6-8559-e9dd2e4a6789","Type":"ContainerStarted","Data":"8741097e70b408afbcf60ba7e6362452a36a535fdffdc3f1d803f15a8b759aff"} Apr 23 13:54:24.317352 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:54:24.317307 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6ff9q/must-gather-h9g2f" 
event={"ID":"db1c7585-fda9-40e6-8559-e9dd2e4a6789","Type":"ContainerStarted","Data":"a5a8b153d5f40376b358e8bc3d29ce680d8a7255e5e5574c4841320620aa1b47"} Apr 23 13:54:24.334242 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:54:24.334179 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-6ff9q/must-gather-h9g2f" podStartSLOduration=2.608829842 podStartE2EDuration="7.334163846s" podCreationTimestamp="2026-04-23 13:54:17 +0000 UTC" firstStartedPulling="2026-04-23 13:54:18.452764478 +0000 UTC m=+1349.911223730" lastFinishedPulling="2026-04-23 13:54:23.178098478 +0000 UTC m=+1354.636557734" observedRunningTime="2026-04-23 13:54:24.332769031 +0000 UTC m=+1355.791228299" watchObservedRunningTime="2026-04-23 13:54:24.334163846 +0000 UTC m=+1355.792623120" Apr 23 13:54:41.376369 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:54:41.376282 2576 generic.go:358] "Generic (PLEG): container finished" podID="db1c7585-fda9-40e6-8559-e9dd2e4a6789" containerID="8741097e70b408afbcf60ba7e6362452a36a535fdffdc3f1d803f15a8b759aff" exitCode=0 Apr 23 13:54:41.376369 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:54:41.376342 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6ff9q/must-gather-h9g2f" event={"ID":"db1c7585-fda9-40e6-8559-e9dd2e4a6789","Type":"ContainerDied","Data":"8741097e70b408afbcf60ba7e6362452a36a535fdffdc3f1d803f15a8b759aff"} Apr 23 13:54:41.376805 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:54:41.376685 2576 scope.go:117] "RemoveContainer" containerID="8741097e70b408afbcf60ba7e6362452a36a535fdffdc3f1d803f15a8b759aff" Apr 23 13:54:41.982182 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:54:41.982153 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-6ff9q_must-gather-h9g2f_db1c7585-fda9-40e6-8559-e9dd2e4a6789/gather/0.log" Apr 23 13:54:45.329267 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:54:45.329233 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kube-system_global-pull-secret-syncer-kjxcs_2d75111d-f6a4-4ae1-8c0b-e20c6a9b9515/global-pull-secret-syncer/0.log" Apr 23 13:54:45.437430 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:54:45.437393 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-62dc8_349fe628-9a5e-4f45-bd89-4d75157af516/konnectivity-agent/0.log" Apr 23 13:54:45.550782 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:54:45.550753 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-134-22.ec2.internal_1b90ee820fd4186f1e6cd40d24ef3276/haproxy/0.log" Apr 23 13:54:47.449929 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:54:47.449899 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-6ff9q/must-gather-h9g2f"] Apr 23 13:54:47.450396 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:54:47.450116 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-6ff9q/must-gather-h9g2f" podUID="db1c7585-fda9-40e6-8559-e9dd2e4a6789" containerName="copy" containerID="cri-o://a5a8b153d5f40376b358e8bc3d29ce680d8a7255e5e5574c4841320620aa1b47" gracePeriod=2 Apr 23 13:54:47.454250 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:54:47.454117 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-6ff9q/must-gather-h9g2f"] Apr 23 13:54:47.672839 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:54:47.672815 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-6ff9q_must-gather-h9g2f_db1c7585-fda9-40e6-8559-e9dd2e4a6789/copy/0.log" Apr 23 13:54:47.673163 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:54:47.673147 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-6ff9q/must-gather-h9g2f" Apr 23 13:54:47.675055 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:54:47.675026 2576 status_manager.go:895] "Failed to get status for pod" podUID="db1c7585-fda9-40e6-8559-e9dd2e4a6789" pod="openshift-must-gather-6ff9q/must-gather-h9g2f" err="pods \"must-gather-h9g2f\" is forbidden: User \"system:node:ip-10-0-134-22.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-6ff9q\": no relationship found between node 'ip-10-0-134-22.ec2.internal' and this object" Apr 23 13:54:47.848019 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:54:47.847932 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/db1c7585-fda9-40e6-8559-e9dd2e4a6789-must-gather-output\") pod \"db1c7585-fda9-40e6-8559-e9dd2e4a6789\" (UID: \"db1c7585-fda9-40e6-8559-e9dd2e4a6789\") " Apr 23 13:54:47.848019 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:54:47.847982 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mq6xj\" (UniqueName: \"kubernetes.io/projected/db1c7585-fda9-40e6-8559-e9dd2e4a6789-kube-api-access-mq6xj\") pod \"db1c7585-fda9-40e6-8559-e9dd2e4a6789\" (UID: \"db1c7585-fda9-40e6-8559-e9dd2e4a6789\") " Apr 23 13:54:47.849335 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:54:47.849305 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/db1c7585-fda9-40e6-8559-e9dd2e4a6789-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "db1c7585-fda9-40e6-8559-e9dd2e4a6789" (UID: "db1c7585-fda9-40e6-8559-e9dd2e4a6789"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 13:54:47.850225 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:54:47.850200 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db1c7585-fda9-40e6-8559-e9dd2e4a6789-kube-api-access-mq6xj" (OuterVolumeSpecName: "kube-api-access-mq6xj") pod "db1c7585-fda9-40e6-8559-e9dd2e4a6789" (UID: "db1c7585-fda9-40e6-8559-e9dd2e4a6789"). InnerVolumeSpecName "kube-api-access-mq6xj". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 13:54:47.949134 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:54:47.949105 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mq6xj\" (UniqueName: \"kubernetes.io/projected/db1c7585-fda9-40e6-8559-e9dd2e4a6789-kube-api-access-mq6xj\") on node \"ip-10-0-134-22.ec2.internal\" DevicePath \"\"" Apr 23 13:54:47.949134 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:54:47.949130 2576 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/db1c7585-fda9-40e6-8559-e9dd2e4a6789-must-gather-output\") on node \"ip-10-0-134-22.ec2.internal\" DevicePath \"\"" Apr 23 13:54:48.399655 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:54:48.399623 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-6ff9q_must-gather-h9g2f_db1c7585-fda9-40e6-8559-e9dd2e4a6789/copy/0.log" Apr 23 13:54:48.399978 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:54:48.399954 2576 generic.go:358] "Generic (PLEG): container finished" podID="db1c7585-fda9-40e6-8559-e9dd2e4a6789" containerID="a5a8b153d5f40376b358e8bc3d29ce680d8a7255e5e5574c4841320620aa1b47" exitCode=143 Apr 23 13:54:48.400049 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:54:48.400015 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-6ff9q/must-gather-h9g2f" Apr 23 13:54:48.400049 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:54:48.400046 2576 scope.go:117] "RemoveContainer" containerID="a5a8b153d5f40376b358e8bc3d29ce680d8a7255e5e5574c4841320620aa1b47" Apr 23 13:54:48.402359 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:54:48.402330 2576 status_manager.go:895] "Failed to get status for pod" podUID="db1c7585-fda9-40e6-8559-e9dd2e4a6789" pod="openshift-must-gather-6ff9q/must-gather-h9g2f" err="pods \"must-gather-h9g2f\" is forbidden: User \"system:node:ip-10-0-134-22.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-6ff9q\": no relationship found between node 'ip-10-0-134-22.ec2.internal' and this object" Apr 23 13:54:48.407947 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:54:48.407700 2576 scope.go:117] "RemoveContainer" containerID="8741097e70b408afbcf60ba7e6362452a36a535fdffdc3f1d803f15a8b759aff" Apr 23 13:54:48.412545 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:54:48.412520 2576 status_manager.go:895] "Failed to get status for pod" podUID="db1c7585-fda9-40e6-8559-e9dd2e4a6789" pod="openshift-must-gather-6ff9q/must-gather-h9g2f" err="pods \"must-gather-h9g2f\" is forbidden: User \"system:node:ip-10-0-134-22.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-6ff9q\": no relationship found between node 'ip-10-0-134-22.ec2.internal' and this object" Apr 23 13:54:48.421689 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:54:48.421675 2576 scope.go:117] "RemoveContainer" containerID="a5a8b153d5f40376b358e8bc3d29ce680d8a7255e5e5574c4841320620aa1b47" Apr 23 13:54:48.421946 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:54:48.421927 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a5a8b153d5f40376b358e8bc3d29ce680d8a7255e5e5574c4841320620aa1b47\": container with ID starting 
with a5a8b153d5f40376b358e8bc3d29ce680d8a7255e5e5574c4841320620aa1b47 not found: ID does not exist" containerID="a5a8b153d5f40376b358e8bc3d29ce680d8a7255e5e5574c4841320620aa1b47" Apr 23 13:54:48.421989 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:54:48.421953 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5a8b153d5f40376b358e8bc3d29ce680d8a7255e5e5574c4841320620aa1b47"} err="failed to get container status \"a5a8b153d5f40376b358e8bc3d29ce680d8a7255e5e5574c4841320620aa1b47\": rpc error: code = NotFound desc = could not find container \"a5a8b153d5f40376b358e8bc3d29ce680d8a7255e5e5574c4841320620aa1b47\": container with ID starting with a5a8b153d5f40376b358e8bc3d29ce680d8a7255e5e5574c4841320620aa1b47 not found: ID does not exist" Apr 23 13:54:48.421989 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:54:48.421974 2576 scope.go:117] "RemoveContainer" containerID="8741097e70b408afbcf60ba7e6362452a36a535fdffdc3f1d803f15a8b759aff" Apr 23 13:54:48.422200 ip-10-0-134-22 kubenswrapper[2576]: E0423 13:54:48.422186 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8741097e70b408afbcf60ba7e6362452a36a535fdffdc3f1d803f15a8b759aff\": container with ID starting with 8741097e70b408afbcf60ba7e6362452a36a535fdffdc3f1d803f15a8b759aff not found: ID does not exist" containerID="8741097e70b408afbcf60ba7e6362452a36a535fdffdc3f1d803f15a8b759aff" Apr 23 13:54:48.422244 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:54:48.422204 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8741097e70b408afbcf60ba7e6362452a36a535fdffdc3f1d803f15a8b759aff"} err="failed to get container status \"8741097e70b408afbcf60ba7e6362452a36a535fdffdc3f1d803f15a8b759aff\": rpc error: code = NotFound desc = could not find container \"8741097e70b408afbcf60ba7e6362452a36a535fdffdc3f1d803f15a8b759aff\": container with ID starting with 
8741097e70b408afbcf60ba7e6362452a36a535fdffdc3f1d803f15a8b759aff not found: ID does not exist" Apr 23 13:54:48.876043 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:54:48.875968 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-jbdzg_e382350e-9e57-4746-b19b-b8c4c615a7b2/kube-state-metrics/0.log" Apr 23 13:54:48.897323 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:54:48.897301 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-jbdzg_e382350e-9e57-4746-b19b-b8c4c615a7b2/kube-rbac-proxy-main/0.log" Apr 23 13:54:48.920006 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:54:48.919983 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-jbdzg_e382350e-9e57-4746-b19b-b8c4c615a7b2/kube-rbac-proxy-self/0.log" Apr 23 13:54:48.949986 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:54:48.949966 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-cf57fc8d5-52tld_6dbda42f-9298-42c4-8400-79899f5a6c90/metrics-server/0.log" Apr 23 13:54:49.093007 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:54:49.092977 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-mfnwd_a08bb45f-4b1e-4c3f-98f3-d171b1c3212c/node-exporter/0.log" Apr 23 13:54:49.118966 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:54:49.118943 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-mfnwd_a08bb45f-4b1e-4c3f-98f3-d171b1c3212c/kube-rbac-proxy/0.log" Apr 23 13:54:49.145513 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:54:49.145493 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-mfnwd_a08bb45f-4b1e-4c3f-98f3-d171b1c3212c/init-textfile/0.log" Apr 23 13:54:49.159193 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:54:49.159160 2576 status_manager.go:895] 
"Failed to get status for pod" podUID="db1c7585-fda9-40e6-8559-e9dd2e4a6789" pod="openshift-must-gather-6ff9q/must-gather-h9g2f" err="pods \"must-gather-h9g2f\" is forbidden: User \"system:node:ip-10-0-134-22.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-6ff9q\": no relationship found between node 'ip-10-0-134-22.ec2.internal' and this object" Apr 23 13:54:49.160000 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:54:49.159980 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db1c7585-fda9-40e6-8559-e9dd2e4a6789" path="/var/lib/kubelet/pods/db1c7585-fda9-40e6-8559-e9dd2e4a6789/volumes" Apr 23 13:54:49.249946 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:54:49.249916 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-dgg7g_bfaad655-80b2-4abf-9055-f34b4dc51fc8/kube-rbac-proxy-main/0.log" Apr 23 13:54:49.273243 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:54:49.273220 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-dgg7g_bfaad655-80b2-4abf-9055-f34b4dc51fc8/kube-rbac-proxy-self/0.log" Apr 23 13:54:49.297596 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:54:49.297572 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-dgg7g_bfaad655-80b2-4abf-9055-f34b4dc51fc8/openshift-state-metrics/0.log" Apr 23 13:54:49.661120 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:54:49.661090 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-77c79b9cd5-w4mpm_0c9fffe4-836e-4eb1-ab91-fe058d9c2765/thanos-query/0.log" Apr 23 13:54:49.684706 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:54:49.684685 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_thanos-querier-77c79b9cd5-w4mpm_0c9fffe4-836e-4eb1-ab91-fe058d9c2765/kube-rbac-proxy-web/0.log"
Apr 23 13:54:49.707074 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:54:49.707050 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-77c79b9cd5-w4mpm_0c9fffe4-836e-4eb1-ab91-fe058d9c2765/kube-rbac-proxy/0.log"
Apr 23 13:54:49.729497 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:54:49.729471 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-77c79b9cd5-w4mpm_0c9fffe4-836e-4eb1-ab91-fe058d9c2765/prom-label-proxy/0.log"
Apr 23 13:54:49.755655 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:54:49.755629 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-77c79b9cd5-w4mpm_0c9fffe4-836e-4eb1-ab91-fe058d9c2765/kube-rbac-proxy-rules/0.log"
Apr 23 13:54:49.778939 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:54:49.778916 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-77c79b9cd5-w4mpm_0c9fffe4-836e-4eb1-ab91-fe058d9c2765/kube-rbac-proxy-metrics/0.log"
Apr 23 13:54:51.725050 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:54:51.725023 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-777845d576-zrzmj_fcb73fe1-16ee-4ac5-9fc0-4d13e4c91925/console/0.log"
Apr 23 13:54:51.753872 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:54:51.753845 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-f9vzx_b53af7fd-f48d-4cee-aa0a-867edf1e7051/download-server/0.log"
Apr 23 13:54:52.422205 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:54:52.422175 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-6sl86/perf-node-gather-daemonset-bxx5x"]
Apr 23 13:54:52.422501 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:54:52.422488 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="db1c7585-fda9-40e6-8559-e9dd2e4a6789" containerName="copy"
Apr 23 13:54:52.422501 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:54:52.422502 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="db1c7585-fda9-40e6-8559-e9dd2e4a6789" containerName="copy"
Apr 23 13:54:52.422594 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:54:52.422513 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="db1c7585-fda9-40e6-8559-e9dd2e4a6789" containerName="gather"
Apr 23 13:54:52.422594 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:54:52.422518 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="db1c7585-fda9-40e6-8559-e9dd2e4a6789" containerName="gather"
Apr 23 13:54:52.422660 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:54:52.422598 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="db1c7585-fda9-40e6-8559-e9dd2e4a6789" containerName="copy"
Apr 23 13:54:52.422660 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:54:52.422609 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="db1c7585-fda9-40e6-8559-e9dd2e4a6789" containerName="gather"
Apr 23 13:54:52.427906 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:54:52.427884 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6sl86/perf-node-gather-daemonset-bxx5x"
Apr 23 13:54:52.430202 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:54:52.430179 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-6sl86\"/\"openshift-service-ca.crt\""
Apr 23 13:54:52.431056 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:54:52.431030 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-6sl86\"/\"default-dockercfg-dr2cm\""
Apr 23 13:54:52.431056 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:54:52.431047 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-6sl86\"/\"kube-root-ca.crt\""
Apr 23 13:54:52.435339 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:54:52.435267 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-6sl86/perf-node-gather-daemonset-bxx5x"]
Apr 23 13:54:52.583542 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:54:52.583508 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/827166ea-4dfd-4ddb-b777-e5bbb4066993-proc\") pod \"perf-node-gather-daemonset-bxx5x\" (UID: \"827166ea-4dfd-4ddb-b777-e5bbb4066993\") " pod="openshift-must-gather-6sl86/perf-node-gather-daemonset-bxx5x"
Apr 23 13:54:52.583722 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:54:52.583551 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/827166ea-4dfd-4ddb-b777-e5bbb4066993-lib-modules\") pod \"perf-node-gather-daemonset-bxx5x\" (UID: \"827166ea-4dfd-4ddb-b777-e5bbb4066993\") " pod="openshift-must-gather-6sl86/perf-node-gather-daemonset-bxx5x"
Apr 23 13:54:52.583722 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:54:52.583575 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/827166ea-4dfd-4ddb-b777-e5bbb4066993-sys\") pod \"perf-node-gather-daemonset-bxx5x\" (UID: \"827166ea-4dfd-4ddb-b777-e5bbb4066993\") " pod="openshift-must-gather-6sl86/perf-node-gather-daemonset-bxx5x"
Apr 23 13:54:52.583722 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:54:52.583646 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/827166ea-4dfd-4ddb-b777-e5bbb4066993-podres\") pod \"perf-node-gather-daemonset-bxx5x\" (UID: \"827166ea-4dfd-4ddb-b777-e5bbb4066993\") " pod="openshift-must-gather-6sl86/perf-node-gather-daemonset-bxx5x"
Apr 23 13:54:52.583828 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:54:52.583743 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84rcn\" (UniqueName: \"kubernetes.io/projected/827166ea-4dfd-4ddb-b777-e5bbb4066993-kube-api-access-84rcn\") pod \"perf-node-gather-daemonset-bxx5x\" (UID: \"827166ea-4dfd-4ddb-b777-e5bbb4066993\") " pod="openshift-must-gather-6sl86/perf-node-gather-daemonset-bxx5x"
Apr 23 13:54:52.684776 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:54:52.684696 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-84rcn\" (UniqueName: \"kubernetes.io/projected/827166ea-4dfd-4ddb-b777-e5bbb4066993-kube-api-access-84rcn\") pod \"perf-node-gather-daemonset-bxx5x\" (UID: \"827166ea-4dfd-4ddb-b777-e5bbb4066993\") " pod="openshift-must-gather-6sl86/perf-node-gather-daemonset-bxx5x"
Apr 23 13:54:52.684776 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:54:52.684735 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/827166ea-4dfd-4ddb-b777-e5bbb4066993-proc\") pod \"perf-node-gather-daemonset-bxx5x\" (UID: \"827166ea-4dfd-4ddb-b777-e5bbb4066993\") " pod="openshift-must-gather-6sl86/perf-node-gather-daemonset-bxx5x"
Apr 23 13:54:52.684776 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:54:52.684759 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/827166ea-4dfd-4ddb-b777-e5bbb4066993-lib-modules\") pod \"perf-node-gather-daemonset-bxx5x\" (UID: \"827166ea-4dfd-4ddb-b777-e5bbb4066993\") " pod="openshift-must-gather-6sl86/perf-node-gather-daemonset-bxx5x"
Apr 23 13:54:52.685011 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:54:52.684781 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/827166ea-4dfd-4ddb-b777-e5bbb4066993-sys\") pod \"perf-node-gather-daemonset-bxx5x\" (UID: \"827166ea-4dfd-4ddb-b777-e5bbb4066993\") " pod="openshift-must-gather-6sl86/perf-node-gather-daemonset-bxx5x"
Apr 23 13:54:52.685011 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:54:52.684798 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/827166ea-4dfd-4ddb-b777-e5bbb4066993-podres\") pod \"perf-node-gather-daemonset-bxx5x\" (UID: \"827166ea-4dfd-4ddb-b777-e5bbb4066993\") " pod="openshift-must-gather-6sl86/perf-node-gather-daemonset-bxx5x"
Apr 23 13:54:52.685011 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:54:52.684841 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/827166ea-4dfd-4ddb-b777-e5bbb4066993-proc\") pod \"perf-node-gather-daemonset-bxx5x\" (UID: \"827166ea-4dfd-4ddb-b777-e5bbb4066993\") " pod="openshift-must-gather-6sl86/perf-node-gather-daemonset-bxx5x"
Apr 23 13:54:52.685011 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:54:52.684885 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/827166ea-4dfd-4ddb-b777-e5bbb4066993-sys\") pod \"perf-node-gather-daemonset-bxx5x\" (UID: \"827166ea-4dfd-4ddb-b777-e5bbb4066993\") " pod="openshift-must-gather-6sl86/perf-node-gather-daemonset-bxx5x"
Apr 23 13:54:52.685011 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:54:52.684921 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/827166ea-4dfd-4ddb-b777-e5bbb4066993-podres\") pod \"perf-node-gather-daemonset-bxx5x\" (UID: \"827166ea-4dfd-4ddb-b777-e5bbb4066993\") " pod="openshift-must-gather-6sl86/perf-node-gather-daemonset-bxx5x"
Apr 23 13:54:52.685011 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:54:52.684937 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/827166ea-4dfd-4ddb-b777-e5bbb4066993-lib-modules\") pod \"perf-node-gather-daemonset-bxx5x\" (UID: \"827166ea-4dfd-4ddb-b777-e5bbb4066993\") " pod="openshift-must-gather-6sl86/perf-node-gather-daemonset-bxx5x"
Apr 23 13:54:52.692370 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:54:52.692346 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-84rcn\" (UniqueName: \"kubernetes.io/projected/827166ea-4dfd-4ddb-b777-e5bbb4066993-kube-api-access-84rcn\") pod \"perf-node-gather-daemonset-bxx5x\" (UID: \"827166ea-4dfd-4ddb-b777-e5bbb4066993\") " pod="openshift-must-gather-6sl86/perf-node-gather-daemonset-bxx5x"
Apr 23 13:54:52.738101 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:54:52.738071 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6sl86/perf-node-gather-daemonset-bxx5x"
Apr 23 13:54:52.816255 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:54:52.816166 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-294bt_85ae1732-8725-4694-959f-e9e424548aca/dns/0.log"
Apr 23 13:54:52.838708 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:54:52.838684 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-294bt_85ae1732-8725-4694-959f-e9e424548aca/kube-rbac-proxy/0.log"
Apr 23 13:54:52.857607 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:54:52.857585 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-6sl86/perf-node-gather-daemonset-bxx5x"]
Apr 23 13:54:52.860536 ip-10-0-134-22 kubenswrapper[2576]: W0423 13:54:52.860510 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod827166ea_4dfd_4ddb_b777_e5bbb4066993.slice/crio-e164127b9809c38dc8330c380da4a60537d8f4c224560824be7d561d26a81d40 WatchSource:0}: Error finding container e164127b9809c38dc8330c380da4a60537d8f4c224560824be7d561d26a81d40: Status 404 returned error can't find the container with id e164127b9809c38dc8330c380da4a60537d8f4c224560824be7d561d26a81d40
Apr 23 13:54:52.947377 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:54:52.947357 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-cj9dc_8ddfabc2-1040-4841-9473-ed5ba1c0c775/dns-node-resolver/0.log"
Apr 23 13:54:53.392061 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:54:53.392035 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-jfnjq_098e4208-e230-428a-af72-f1aa64c09ce0/node-ca/0.log"
Apr 23 13:54:53.415681 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:54:53.415650 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6sl86/perf-node-gather-daemonset-bxx5x" event={"ID":"827166ea-4dfd-4ddb-b777-e5bbb4066993","Type":"ContainerStarted","Data":"dbe85c24f6eaf9576369dc771a79aa694ba97a0055554da653090987e7cdd672"}
Apr 23 13:54:53.415681 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:54:53.415684 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6sl86/perf-node-gather-daemonset-bxx5x" event={"ID":"827166ea-4dfd-4ddb-b777-e5bbb4066993","Type":"ContainerStarted","Data":"e164127b9809c38dc8330c380da4a60537d8f4c224560824be7d561d26a81d40"}
Apr 23 13:54:53.415890 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:54:53.415738 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-6sl86/perf-node-gather-daemonset-bxx5x"
Apr 23 13:54:53.433322 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:54:53.433267 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-6sl86/perf-node-gather-daemonset-bxx5x" podStartSLOduration=1.433253706 podStartE2EDuration="1.433253706s" podCreationTimestamp="2026-04-23 13:54:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 13:54:53.43118539 +0000 UTC m=+1384.889644664" watchObservedRunningTime="2026-04-23 13:54:53.433253706 +0000 UTC m=+1384.891712979"
Apr 23 13:54:54.429020 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:54:54.428974 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-x77wr_37a73ca9-a37a-469b-8043-50d6b6f5ae10/serve-healthcheck-canary/0.log"
Apr 23 13:54:54.859678 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:54:54.859597 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-rb4nw_abf891cd-631a-4743-8e63-cfa25d73e6c1/kube-rbac-proxy/0.log"
Apr 23 13:54:54.879844 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:54:54.879823 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-rb4nw_abf891cd-631a-4743-8e63-cfa25d73e6c1/exporter/0.log"
Apr 23 13:54:54.899927 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:54:54.899900 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-rb4nw_abf891cd-631a-4743-8e63-cfa25d73e6c1/extractor/0.log"
Apr 23 13:54:56.853306 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:54:56.853281 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_llmisvc-controller-manager-68cc5db7c4-bq6h4_1361ed5d-a9e7-45a5-bac8-e3f8e6d2ad03/manager/0.log"
Apr 23 13:54:56.957578 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:54:56.957545 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_odh-model-controller-696fc77849-wsszv_a91f57b6-1004-47fe-a0fd-c5630afa5894/manager/0.log"
Apr 23 13:54:59.428638 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:54:59.428606 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-6sl86/perf-node-gather-daemonset-bxx5x"
Apr 23 13:55:00.405240 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:55:00.405207 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-sfkld_16dee9ba-f0e8-474c-8023-1231f9070d98/migrator/0.log"
Apr 23 13:55:00.425528 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:55:00.425507 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-sfkld_16dee9ba-f0e8-474c-8023-1231f9070d98/graceful-termination/0.log"
Apr 23 13:55:01.849971 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:55:01.849897 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-275jz_55a0340a-0400-482b-8422-3e0465f0802d/kube-multus/0.log"
Apr 23 13:55:01.903200 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:55:01.903166 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-57l24_dc7fc0e6-0ff2-4765-b7da-e0e49fb829cd/kube-multus-additional-cni-plugins/0.log"
Apr 23 13:55:01.927531 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:55:01.927499 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-57l24_dc7fc0e6-0ff2-4765-b7da-e0e49fb829cd/egress-router-binary-copy/0.log"
Apr 23 13:55:01.949293 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:55:01.949269 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-57l24_dc7fc0e6-0ff2-4765-b7da-e0e49fb829cd/cni-plugins/0.log"
Apr 23 13:55:01.969440 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:55:01.969402 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-57l24_dc7fc0e6-0ff2-4765-b7da-e0e49fb829cd/bond-cni-plugin/0.log"
Apr 23 13:55:01.991191 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:55:01.991174 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-57l24_dc7fc0e6-0ff2-4765-b7da-e0e49fb829cd/routeoverride-cni/0.log"
Apr 23 13:55:02.012792 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:55:02.012769 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-57l24_dc7fc0e6-0ff2-4765-b7da-e0e49fb829cd/whereabouts-cni-bincopy/0.log"
Apr 23 13:55:02.036845 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:55:02.036814 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-57l24_dc7fc0e6-0ff2-4765-b7da-e0e49fb829cd/whereabouts-cni/0.log"
Apr 23 13:55:02.411935 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:55:02.411901 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-564gb_0dcb0e48-f774-49cd-8b04-58a5050a5ff2/network-metrics-daemon/0.log"
Apr 23 13:55:02.431135 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:55:02.431111 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-564gb_0dcb0e48-f774-49cd-8b04-58a5050a5ff2/kube-rbac-proxy/0.log"
Apr 23 13:55:03.577075 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:55:03.576985 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-72hmc_edb014da-0558-4a2a-9f98-bea52a2c723e/ovn-controller/0.log"
Apr 23 13:55:03.599534 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:55:03.599505 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-72hmc_edb014da-0558-4a2a-9f98-bea52a2c723e/ovn-acl-logging/0.log"
Apr 23 13:55:03.605626 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:55:03.605606 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-72hmc_edb014da-0558-4a2a-9f98-bea52a2c723e/ovn-acl-logging/1.log"
Apr 23 13:55:03.626468 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:55:03.626448 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-72hmc_edb014da-0558-4a2a-9f98-bea52a2c723e/kube-rbac-proxy-node/0.log"
Apr 23 13:55:03.648735 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:55:03.648713 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-72hmc_edb014da-0558-4a2a-9f98-bea52a2c723e/kube-rbac-proxy-ovn-metrics/0.log"
Apr 23 13:55:03.669206 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:55:03.669184 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-72hmc_edb014da-0558-4a2a-9f98-bea52a2c723e/northd/0.log"
Apr 23 13:55:03.689394 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:55:03.689373 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-72hmc_edb014da-0558-4a2a-9f98-bea52a2c723e/nbdb/0.log"
Apr 23 13:55:03.713576 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:55:03.713552 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-72hmc_edb014da-0558-4a2a-9f98-bea52a2c723e/sbdb/0.log"
Apr 23 13:55:03.802049 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:55:03.802019 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-72hmc_edb014da-0558-4a2a-9f98-bea52a2c723e/ovnkube-controller/0.log"
Apr 23 13:55:05.011292 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:55:05.011260 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-dggd8_ab515304-a26a-4e7e-bfeb-cc0ca3a93c8e/network-check-target-container/0.log"
Apr 23 13:55:05.966319 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:55:05.966289 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-66br2_324be42f-87e9-413c-a39b-1c5ebac3ad6d/iptables-alerter/0.log"
Apr 23 13:55:06.651034 ip-10-0-134-22 kubenswrapper[2576]: I0423 13:55:06.651000 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-zzbbp_d9ada073-20d9-454e-b803-aef6be2e17c7/tuned/0.log"