Apr 21 03:57:36.199649 ip-10-0-134-136 systemd[1]: Starting Kubernetes Kubelet...
Apr 21 03:57:36.681542 ip-10-0-134-136 kubenswrapper[2578]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 21 03:57:36.681542 ip-10-0-134-136 kubenswrapper[2578]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 21 03:57:36.681542 ip-10-0-134-136 kubenswrapper[2578]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 21 03:57:36.681542 ip-10-0-134-136 kubenswrapper[2578]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 21 03:57:36.681542 ip-10-0-134-136 kubenswrapper[2578]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 21 03:57:36.683145 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.683042 2578 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 21 03:57:36.685760 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.685738 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 03:57:36.685760 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.685757 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 03:57:36.685760 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.685763 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 03:57:36.685760 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.685767 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 03:57:36.686005 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.685771 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 03:57:36.686005 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.685776 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 03:57:36.686005 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.685779 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 03:57:36.686005 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.685783 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 03:57:36.686005 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.685787 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 03:57:36.686005 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.685797 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 03:57:36.686005 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.685802 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 03:57:36.686005 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.685806 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 03:57:36.686005 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.685809 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 03:57:36.686005 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.685813 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 03:57:36.686005 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.685817 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 03:57:36.686005 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.685821 2578 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 03:57:36.686005 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.685825 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 03:57:36.686005 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.685830 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 03:57:36.686005 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.685834 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 03:57:36.686005 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.685838 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 03:57:36.686005 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.685842 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 03:57:36.686005 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.685846 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 03:57:36.686005 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.685850 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 03:57:36.686005 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.685854 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 03:57:36.686772 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.685858 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 03:57:36.686772 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.685862 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 03:57:36.686772 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.685866 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 03:57:36.686772 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.685870 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 03:57:36.686772 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.685877 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 21 03:57:36.686772 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.685883 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 03:57:36.686772 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.685888 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 03:57:36.686772 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.685892 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 03:57:36.686772 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.685896 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 03:57:36.686772 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.685902 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 03:57:36.686772 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.685906 2578 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 03:57:36.686772 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.685910 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 03:57:36.686772 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.685914 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 03:57:36.686772 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.685921 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 03:57:36.686772 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.685926 2578 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 03:57:36.686772 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.685930 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 03:57:36.686772 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.685934 2578 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 03:57:36.686772 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.685939 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 03:57:36.686772 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.685943 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 03:57:36.686772 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.685947 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 03:57:36.687275 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.685952 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 03:57:36.687275 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.685956 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 03:57:36.687275 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.685960 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 03:57:36.687275 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.685963 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 03:57:36.687275 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.685968 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 03:57:36.687275 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.685972 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 03:57:36.687275 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.685976 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 03:57:36.687275 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.685980 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 03:57:36.687275 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.685984 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 03:57:36.687275 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.685989 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 03:57:36.687275 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.685993 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 03:57:36.687275 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.685997 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 03:57:36.687275 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.686001 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 03:57:36.687275 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.686005 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 03:57:36.687275 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.686009 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 03:57:36.687275 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.686014 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 03:57:36.687275 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.686018 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 03:57:36.687275 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.686022 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 03:57:36.687275 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.686028 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 03:57:36.687771 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.686032 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 03:57:36.687771 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.686037 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 03:57:36.687771 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.686041 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 03:57:36.687771 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.686045 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 03:57:36.687771 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.686049 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 03:57:36.687771 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.686053 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 03:57:36.687771 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.686058 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 03:57:36.687771 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.686062 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 03:57:36.687771 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.686067 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 03:57:36.687771 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.686071 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 03:57:36.687771 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.686075 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 03:57:36.687771 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.686079 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 03:57:36.687771 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.686083 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 03:57:36.687771 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.686089 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 03:57:36.687771 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.686096 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 03:57:36.687771 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.686100 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 03:57:36.687771 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.686104 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 03:57:36.687771 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.686109 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 03:57:36.687771 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.686112 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 03:57:36.688244 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.686116 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 03:57:36.688244 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.686120 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 03:57:36.688244 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.686124 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 03:57:36.688244 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.686129 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 03:57:36.688841 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.688825 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 03:57:36.688841 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.688841 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 03:57:36.688954 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.688846 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 03:57:36.688954 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.688851 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 03:57:36.688954 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.688856 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 03:57:36.688954 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.688862 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 03:57:36.688954 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.688866 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 03:57:36.688954 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.688871 2578 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 03:57:36.688954 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.688876 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 03:57:36.688954 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.688880 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 03:57:36.688954 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.688885 2578 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 03:57:36.688954 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.688892 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 21 03:57:36.688954 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.688898 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 03:57:36.688954 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.688903 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 03:57:36.688954 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.688907 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 03:57:36.688954 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.688911 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 03:57:36.688954 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.688914 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 03:57:36.688954 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.688919 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 03:57:36.688954 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.688923 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 03:57:36.688954 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.688927 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 03:57:36.688954 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.688931 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 03:57:36.689654 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.688935 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 03:57:36.689654 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.688939 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 03:57:36.689654 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.688943 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 03:57:36.689654 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.688947 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 03:57:36.689654 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.688951 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 03:57:36.689654 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.688955 2578 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 03:57:36.689654 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.688959 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 03:57:36.689654 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.688963 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 03:57:36.689654 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.688967 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 03:57:36.689654 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.688972 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 03:57:36.689654 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.688975 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 03:57:36.689654 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.688980 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 03:57:36.689654 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.688984 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 03:57:36.689654 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.688988 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 03:57:36.689654 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.688994 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 03:57:36.689654 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.688998 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 03:57:36.689654 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.689002 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 03:57:36.689654 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.689007 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 03:57:36.689654 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.689011 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 03:57:36.690125 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.689015 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 03:57:36.690125 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.689018 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 03:57:36.690125 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.689022 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 03:57:36.690125 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.689026 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 03:57:36.690125 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.689032 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 03:57:36.690125 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.689038 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 03:57:36.690125 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.689042 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 03:57:36.690125 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.689046 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 03:57:36.690125 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.689050 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 03:57:36.690125 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.689053 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 03:57:36.690125 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.689057 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 03:57:36.690125 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.689061 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 03:57:36.690125 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.689066 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 03:57:36.690125 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.689070 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 03:57:36.690125 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.689075 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 03:57:36.690125 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.689079 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 03:57:36.690125 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.689083 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 03:57:36.690125 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.689087 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 03:57:36.690125 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.689091 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 03:57:36.690622 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.689097 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 03:57:36.690622 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.689101 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 03:57:36.690622 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.689105 2578 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 03:57:36.690622 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.689109 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 03:57:36.690622 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.689113 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 03:57:36.690622 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.689117 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 03:57:36.690622 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.689121 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 03:57:36.690622 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.689125 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 03:57:36.690622 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.689129 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 03:57:36.690622 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.689133 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 03:57:36.690622 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.689138 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 03:57:36.690622 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.689141 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 03:57:36.690622 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.689146 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 03:57:36.690622 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.689151 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 03:57:36.690622 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.689155 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 03:57:36.690622 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.689159 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 03:57:36.690622 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.689163 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 03:57:36.690622 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.689167 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 03:57:36.690622 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.689172 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 03:57:36.690622 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.689176 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 03:57:36.691112 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.689180 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 03:57:36.691112 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.689184 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 03:57:36.691112 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.689188 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 03:57:36.691112 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.689192 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 03:57:36.691112 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.689196 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 03:57:36.691112 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.689202 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 03:57:36.691112 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.689206 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 03:57:36.691112 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.689323 2578 flags.go:64] FLAG: --address="0.0.0.0"
Apr 21 03:57:36.691112 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.689341 2578 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 21 03:57:36.691112 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.689351 2578 flags.go:64] FLAG: --anonymous-auth="true"
Apr 21 03:57:36.691112 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.689359 2578 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 21 03:57:36.691112 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.689366 2578 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 21 03:57:36.691112 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.689372 2578 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 21 03:57:36.691112 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.689378 2578 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 21 03:57:36.691112 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.689386 2578 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 21 03:57:36.691112 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.689391 2578 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 21 03:57:36.691112 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.689396 2578 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 21 03:57:36.691112 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.689402 2578 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 21 03:57:36.691112 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.689407 2578 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 21 03:57:36.691112 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.689412 2578 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 21 03:57:36.691112 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.689418 2578 flags.go:64] FLAG: --cgroup-root=""
Apr 21 03:57:36.691112 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.689423 2578 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 21 03:57:36.691112 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.689428 2578 flags.go:64] FLAG: --client-ca-file=""
Apr 21 03:57:36.691712 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.689432 2578 flags.go:64] FLAG: --cloud-config=""
Apr 21 03:57:36.691712 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.689437 2578 flags.go:64] FLAG: --cloud-provider="external"
Apr 21 03:57:36.691712 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.689443 2578 flags.go:64] FLAG: --cluster-dns="[]"
Apr 21 03:57:36.691712 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.689450 2578 flags.go:64] FLAG: --cluster-domain=""
Apr 21 03:57:36.691712 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.689454 2578 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 21 03:57:36.691712 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.689459 2578 flags.go:64] FLAG: --config-dir=""
Apr 21 03:57:36.691712 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.689464 2578 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 21 03:57:36.691712 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.689470 2578 flags.go:64] FLAG: --container-log-max-files="5"
Apr 21 03:57:36.691712 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.689527 2578 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 21 03:57:36.691712 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.689532 2578 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 21 03:57:36.691712 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.689537 2578 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 21 03:57:36.691712 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.689542 2578 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 21 03:57:36.691712 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.689546 2578 flags.go:64] FLAG: --contention-profiling="false"
Apr 21 03:57:36.691712 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.689550 2578 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 21 03:57:36.691712 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.689554 2578 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 21 03:57:36.691712 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.689558 2578 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 21 03:57:36.691712 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.689561 2578 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 21 03:57:36.691712 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.689567 2578 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 21 03:57:36.691712 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.689570 2578 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 21 03:57:36.691712 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.689573 2578 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 21 03:57:36.691712 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.689576 2578 flags.go:64] FLAG: --enable-load-reader="false"
Apr 21 03:57:36.691712 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.689579 2578 flags.go:64] FLAG: --enable-server="true"
Apr 21 03:57:36.691712 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.689583 2578 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 21 03:57:36.691712 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.689589 2578 flags.go:64] FLAG: --event-burst="100"
Apr 21 03:57:36.691712 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.689592 2578 flags.go:64] FLAG: --event-qps="50"
Apr 21 03:57:36.692333 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.689596 2578 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 21 03:57:36.692333 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.689599 2578 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 21 03:57:36.692333 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.689602 2578 flags.go:64] FLAG: --eviction-hard=""
Apr 21 03:57:36.692333 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.689606 2578 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 21 03:57:36.692333 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.689609 2578 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 21 03:57:36.692333 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.689612 2578 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 21 03:57:36.692333 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.689615 2578 flags.go:64] FLAG: --eviction-soft=""
Apr 21 03:57:36.692333 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.689618 2578 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 21 03:57:36.692333 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.689621 2578 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 21 03:57:36.692333 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.689624 2578 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 21 03:57:36.692333 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.689627 2578 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 21 03:57:36.692333 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.689630 2578 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 21 03:57:36.692333 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.689633 2578 flags.go:64] FLAG: --fail-swap-on="true"
Apr 21
03:57:36.692333 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.689636 2578 flags.go:64] FLAG: --feature-gates="" Apr 21 03:57:36.692333 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.689640 2578 flags.go:64] FLAG: --file-check-frequency="20s" Apr 21 03:57:36.692333 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.689645 2578 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 21 03:57:36.692333 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.689649 2578 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 21 03:57:36.692333 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.689652 2578 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 21 03:57:36.692333 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.689657 2578 flags.go:64] FLAG: --healthz-port="10248" Apr 21 03:57:36.692333 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.689660 2578 flags.go:64] FLAG: --help="false" Apr 21 03:57:36.692333 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.689663 2578 flags.go:64] FLAG: --hostname-override="ip-10-0-134-136.ec2.internal" Apr 21 03:57:36.692333 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.689667 2578 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 21 03:57:36.692333 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.689670 2578 flags.go:64] FLAG: --http-check-frequency="20s" Apr 21 03:57:36.692333 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.689674 2578 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 21 03:57:36.692950 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.689678 2578 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 21 03:57:36.692950 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.689682 2578 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 21 03:57:36.692950 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.689685 2578 
flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 21 03:57:36.692950 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.689688 2578 flags.go:64] FLAG: --image-service-endpoint="" Apr 21 03:57:36.692950 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.689691 2578 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 21 03:57:36.692950 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.689694 2578 flags.go:64] FLAG: --kube-api-burst="100" Apr 21 03:57:36.692950 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.689697 2578 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 21 03:57:36.692950 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.689701 2578 flags.go:64] FLAG: --kube-api-qps="50" Apr 21 03:57:36.692950 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.689704 2578 flags.go:64] FLAG: --kube-reserved="" Apr 21 03:57:36.692950 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.689707 2578 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 21 03:57:36.692950 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.689710 2578 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 21 03:57:36.692950 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.689713 2578 flags.go:64] FLAG: --kubelet-cgroups="" Apr 21 03:57:36.692950 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.689716 2578 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 21 03:57:36.692950 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.689719 2578 flags.go:64] FLAG: --lock-file="" Apr 21 03:57:36.692950 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.689722 2578 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 21 03:57:36.692950 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.689725 2578 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 21 03:57:36.692950 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.689728 2578 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 21 03:57:36.692950 ip-10-0-134-136 
kubenswrapper[2578]: I0421 03:57:36.689734 2578 flags.go:64] FLAG: --log-json-split-stream="false" Apr 21 03:57:36.692950 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.689737 2578 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 21 03:57:36.692950 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.689740 2578 flags.go:64] FLAG: --log-text-split-stream="false" Apr 21 03:57:36.692950 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.689743 2578 flags.go:64] FLAG: --logging-format="text" Apr 21 03:57:36.692950 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.689746 2578 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 21 03:57:36.692950 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.689749 2578 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 21 03:57:36.692950 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.689753 2578 flags.go:64] FLAG: --manifest-url="" Apr 21 03:57:36.693543 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.689756 2578 flags.go:64] FLAG: --manifest-url-header="" Apr 21 03:57:36.693543 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.689761 2578 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 21 03:57:36.693543 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.689765 2578 flags.go:64] FLAG: --max-open-files="1000000" Apr 21 03:57:36.693543 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.689769 2578 flags.go:64] FLAG: --max-pods="110" Apr 21 03:57:36.693543 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.689772 2578 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 21 03:57:36.693543 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.689776 2578 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 21 03:57:36.693543 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.689778 2578 flags.go:64] FLAG: --memory-manager-policy="None" Apr 21 03:57:36.693543 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.689782 2578 flags.go:64] FLAG: 
--minimum-container-ttl-duration="6m0s" Apr 21 03:57:36.693543 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.689785 2578 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 21 03:57:36.693543 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.689788 2578 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 21 03:57:36.693543 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.689791 2578 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 21 03:57:36.693543 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.689800 2578 flags.go:64] FLAG: --node-status-max-images="50" Apr 21 03:57:36.693543 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.689803 2578 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 21 03:57:36.693543 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.689806 2578 flags.go:64] FLAG: --oom-score-adj="-999" Apr 21 03:57:36.693543 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.689809 2578 flags.go:64] FLAG: --pod-cidr="" Apr 21 03:57:36.693543 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.689812 2578 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 21 03:57:36.693543 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.689818 2578 flags.go:64] FLAG: --pod-manifest-path="" Apr 21 03:57:36.693543 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.689821 2578 flags.go:64] FLAG: --pod-max-pids="-1" Apr 21 03:57:36.693543 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.689825 2578 flags.go:64] FLAG: --pods-per-core="0" Apr 21 03:57:36.693543 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.689827 2578 flags.go:64] FLAG: --port="10250" Apr 21 03:57:36.693543 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.689831 2578 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 21 03:57:36.693543 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.689834 
2578 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-017ce7a85cfd61992" Apr 21 03:57:36.693543 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.689838 2578 flags.go:64] FLAG: --qos-reserved="" Apr 21 03:57:36.693543 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.689841 2578 flags.go:64] FLAG: --read-only-port="10255" Apr 21 03:57:36.694110 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.689844 2578 flags.go:64] FLAG: --register-node="true" Apr 21 03:57:36.694110 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.689847 2578 flags.go:64] FLAG: --register-schedulable="true" Apr 21 03:57:36.694110 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.689850 2578 flags.go:64] FLAG: --register-with-taints="" Apr 21 03:57:36.694110 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.689853 2578 flags.go:64] FLAG: --registry-burst="10" Apr 21 03:57:36.694110 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.689856 2578 flags.go:64] FLAG: --registry-qps="5" Apr 21 03:57:36.694110 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.689859 2578 flags.go:64] FLAG: --reserved-cpus="" Apr 21 03:57:36.694110 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.689862 2578 flags.go:64] FLAG: --reserved-memory="" Apr 21 03:57:36.694110 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.689866 2578 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 21 03:57:36.694110 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.689870 2578 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 21 03:57:36.694110 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.689873 2578 flags.go:64] FLAG: --rotate-certificates="false" Apr 21 03:57:36.694110 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.689877 2578 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 21 03:57:36.694110 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.689880 2578 flags.go:64] FLAG: --runonce="false" Apr 21 03:57:36.694110 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.689883 2578 
flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 21 03:57:36.694110 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.689886 2578 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 21 03:57:36.694110 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.689889 2578 flags.go:64] FLAG: --seccomp-default="false" Apr 21 03:57:36.694110 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.689892 2578 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 21 03:57:36.694110 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.689895 2578 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 21 03:57:36.694110 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.689898 2578 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 21 03:57:36.694110 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.689901 2578 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 21 03:57:36.694110 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.689904 2578 flags.go:64] FLAG: --storage-driver-password="root" Apr 21 03:57:36.694110 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.689907 2578 flags.go:64] FLAG: --storage-driver-secure="false" Apr 21 03:57:36.694110 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.689910 2578 flags.go:64] FLAG: --storage-driver-table="stats" Apr 21 03:57:36.694110 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.689913 2578 flags.go:64] FLAG: --storage-driver-user="root" Apr 21 03:57:36.694110 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.689917 2578 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 21 03:57:36.694110 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.689920 2578 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 21 03:57:36.694110 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.689923 2578 flags.go:64] FLAG: --system-cgroups="" Apr 21 03:57:36.694785 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.689926 2578 flags.go:64] FLAG: 
--system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 21 03:57:36.694785 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.689933 2578 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 21 03:57:36.694785 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.689936 2578 flags.go:64] FLAG: --tls-cert-file="" Apr 21 03:57:36.694785 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.689939 2578 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 21 03:57:36.694785 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.689943 2578 flags.go:64] FLAG: --tls-min-version="" Apr 21 03:57:36.694785 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.689946 2578 flags.go:64] FLAG: --tls-private-key-file="" Apr 21 03:57:36.694785 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.689949 2578 flags.go:64] FLAG: --topology-manager-policy="none" Apr 21 03:57:36.694785 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.689952 2578 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 21 03:57:36.694785 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.689955 2578 flags.go:64] FLAG: --topology-manager-scope="container" Apr 21 03:57:36.694785 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.689958 2578 flags.go:64] FLAG: --v="2" Apr 21 03:57:36.694785 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.689963 2578 flags.go:64] FLAG: --version="false" Apr 21 03:57:36.694785 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.689967 2578 flags.go:64] FLAG: --vmodule="" Apr 21 03:57:36.694785 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.689971 2578 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 21 03:57:36.694785 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.689975 2578 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 21 03:57:36.694785 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.690076 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 21 03:57:36.694785 ip-10-0-134-136 
kubenswrapper[2578]: W0421 03:57:36.690080 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 21 03:57:36.694785 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.690084 2578 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 21 03:57:36.694785 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.690088 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 21 03:57:36.694785 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.690091 2578 feature_gate.go:328] unrecognized feature gate: Example Apr 21 03:57:36.694785 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.690093 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 21 03:57:36.694785 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.690096 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 21 03:57:36.694785 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.690099 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 21 03:57:36.694785 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.690101 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 21 03:57:36.695360 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.690104 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 21 03:57:36.695360 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.690107 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 21 03:57:36.695360 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.690109 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 21 03:57:36.695360 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.690112 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 21 03:57:36.695360 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.690114 2578 feature_gate.go:328] 
unrecognized feature gate: MultiDiskSetup Apr 21 03:57:36.695360 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.690117 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 21 03:57:36.695360 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.690120 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 21 03:57:36.695360 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.690122 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 21 03:57:36.695360 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.690125 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 21 03:57:36.695360 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.690128 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 21 03:57:36.695360 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.690130 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 21 03:57:36.695360 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.690133 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 21 03:57:36.695360 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.690136 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 21 03:57:36.695360 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.690139 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 21 03:57:36.695360 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.690141 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 21 03:57:36.695360 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.690144 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 21 03:57:36.695360 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.690146 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 21 03:57:36.695360 ip-10-0-134-136 
kubenswrapper[2578]: W0421 03:57:36.690149 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 21 03:57:36.695360 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.690151 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 21 03:57:36.695852 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.690154 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 21 03:57:36.695852 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.690156 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 21 03:57:36.695852 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.690159 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 21 03:57:36.695852 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.690162 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 21 03:57:36.695852 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.690166 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 21 03:57:36.695852 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.690170 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 21 03:57:36.695852 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.690172 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 21 03:57:36.695852 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.690175 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 21 03:57:36.695852 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.690178 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 21 03:57:36.695852 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.690181 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 21 03:57:36.695852 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.690183 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 21 03:57:36.695852 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.690186 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 21 03:57:36.695852 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.690188 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 21 03:57:36.695852 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.690191 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 21 03:57:36.695852 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.690194 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 21 03:57:36.695852 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.690196 2578 feature_gate.go:328] unrecognized feature gate: Example2 Apr 21 03:57:36.695852 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.690199 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 21 03:57:36.695852 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.690201 2578 
feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 21 03:57:36.695852 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.690204 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 21 03:57:36.695852 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.690206 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 21 03:57:36.696560 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.690209 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 21 03:57:36.696560 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.690211 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 21 03:57:36.696560 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.690214 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 21 03:57:36.696560 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.690217 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 21 03:57:36.696560 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.690220 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 21 03:57:36.696560 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.690223 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 21 03:57:36.696560 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.690225 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 21 03:57:36.696560 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.690228 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 21 03:57:36.696560 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.690230 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 21 03:57:36.696560 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.690234 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. 
It will be removed in a future release. Apr 21 03:57:36.696560 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.690237 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 21 03:57:36.696560 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.690240 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 21 03:57:36.696560 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.690243 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 21 03:57:36.696560 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.690245 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 21 03:57:36.696560 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.690249 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 21 03:57:36.696560 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.690252 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 21 03:57:36.696560 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.690255 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 21 03:57:36.696560 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.690266 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 21 03:57:36.696560 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.690269 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 21 03:57:36.697198 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.690271 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 21 03:57:36.697198 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.690274 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 21 03:57:36.697198 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.690294 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 21 03:57:36.697198 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.690300 
2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 03:57:36.697198 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.690304 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 03:57:36.697198 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.690307 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 03:57:36.697198 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.690309 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 03:57:36.697198 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.690312 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 03:57:36.697198 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.690315 2578 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 03:57:36.697198 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.690318 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 03:57:36.697198 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.690320 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 03:57:36.697198 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.690323 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 03:57:36.697198 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.690326 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 03:57:36.697198 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.690328 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 03:57:36.697198 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.690331 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 03:57:36.697198 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.690333 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 03:57:36.697198 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.690336 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 03:57:36.697198 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.690338 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 03:57:36.697198 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.690341 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 03:57:36.697686 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.690347 2578 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 21 03:57:36.698612 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.698591 2578 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 21 03:57:36.698646 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.698613 2578 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 21 03:57:36.698679 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.698663 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 03:57:36.698679 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.698669 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 03:57:36.698679 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.698673 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 03:57:36.698679 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.698676 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 03:57:36.698679 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.698678 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 03:57:36.698679 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.698681 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 03:57:36.698831 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.698684 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 03:57:36.698831 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.698687 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 03:57:36.698831 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.698690 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 03:57:36.698831 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.698693 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 03:57:36.698831 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.698695 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 03:57:36.698831 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.698698 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 03:57:36.698831 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.698701 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 03:57:36.698831 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.698704 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 03:57:36.698831 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.698722 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 03:57:36.698831 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.698725 2578 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 03:57:36.698831 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.698729 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 03:57:36.698831 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.698731 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 03:57:36.698831 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.698734 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 03:57:36.698831 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.698737 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 03:57:36.698831 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.698740 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 03:57:36.698831 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.698743 2578 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 03:57:36.698831 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.698747 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 03:57:36.698831 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.698749 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 03:57:36.698831 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.698752 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 03:57:36.698831 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.698755 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 03:57:36.699555 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.698758 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 03:57:36.699555 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.698761 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 03:57:36.699555 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.698764 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 03:57:36.699555 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.698767 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 03:57:36.699555 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.698770 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 03:57:36.699555 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.698773 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 03:57:36.699555 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.698775 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 03:57:36.699555 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.698778 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 03:57:36.699555 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.698780 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 03:57:36.699555 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.698783 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 03:57:36.699555 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.698785 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 03:57:36.699555 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.698788 2578 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 03:57:36.699555 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.698791 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 03:57:36.699555 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.698794 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 03:57:36.699555 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.698797 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 03:57:36.699555 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.698799 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 03:57:36.699555 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.698802 2578 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 03:57:36.699555 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.698805 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 03:57:36.699555 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.698807 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 03:57:36.699555 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.698810 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 03:57:36.700058 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.698812 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 03:57:36.700058 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.698815 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 03:57:36.700058 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.698818 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 03:57:36.700058 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.698820 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 03:57:36.700058 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.698823 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 03:57:36.700058 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.698826 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 03:57:36.700058 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.698829 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 03:57:36.700058 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.698832 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 03:57:36.700058 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.698835 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 03:57:36.700058 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.698837 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 03:57:36.700058 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.698840 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 03:57:36.700058 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.698843 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 03:57:36.700058 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.698846 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 03:57:36.700058 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.698849 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 03:57:36.700058 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.698851 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 03:57:36.700058 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.698854 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 03:57:36.700058 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.698857 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 03:57:36.700058 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.698859 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 03:57:36.700058 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.698862 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 03:57:36.700532 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.698865 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 03:57:36.700532 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.698868 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 03:57:36.700532 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.698870 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 03:57:36.700532 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.698873 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 03:57:36.700532 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.698876 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 03:57:36.700532 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.698880 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 03:57:36.700532 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.698883 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 03:57:36.700532 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.698886 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 03:57:36.700532 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.698889 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 03:57:36.700532 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.698891 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 03:57:36.700532 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.698894 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 03:57:36.700532 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.698897 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 03:57:36.700532 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.698900 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 03:57:36.700532 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.698902 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 03:57:36.700532 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.698905 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 03:57:36.700532 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.698908 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 03:57:36.700532 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.698911 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 03:57:36.700532 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.698913 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 03:57:36.700532 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.698916 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 03:57:36.700996 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.698920 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 21 03:57:36.700996 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.698924 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 03:57:36.700996 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.698929 2578 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 21 03:57:36.700996 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.699028 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 03:57:36.700996 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.699034 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 03:57:36.700996 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.699037 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 03:57:36.700996 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.699040 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 03:57:36.700996 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.699043 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 03:57:36.700996 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.699046 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 03:57:36.700996 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.699049 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 03:57:36.700996 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.699051 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 03:57:36.700996 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.699054 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 03:57:36.700996 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.699056 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 03:57:36.700996 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.699059 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 03:57:36.700996 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.699061 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 03:57:36.701380 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.699064 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 03:57:36.701380 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.699067 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 03:57:36.701380 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.699070 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 03:57:36.701380 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.699072 2578 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 03:57:36.701380 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.699075 2578 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 03:57:36.701380 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.699077 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 03:57:36.701380 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.699080 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 03:57:36.701380 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.699082 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 03:57:36.701380 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.699085 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 03:57:36.701380 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.699087 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 03:57:36.701380 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.699090 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 03:57:36.701380 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.699093 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 03:57:36.701380 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.699096 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 03:57:36.701380 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.699099 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 03:57:36.701380 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.699102 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 03:57:36.701380 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.699104 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 03:57:36.701380 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.699108 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 03:57:36.701380 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.699112 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 03:57:36.701380 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.699115 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 03:57:36.701380 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.699118 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 03:57:36.701883 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.699120 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 03:57:36.701883 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.699123 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 03:57:36.701883 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.699126 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 03:57:36.701883 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.699128 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 03:57:36.701883 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.699131 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 03:57:36.701883 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.699133 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 03:57:36.701883 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.699137 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 21 03:57:36.701883 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.699141 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 03:57:36.701883 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.699144 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 03:57:36.701883 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.699147 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 03:57:36.701883 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.699149 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 03:57:36.701883 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.699152 2578 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 03:57:36.701883 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.699155 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 03:57:36.701883 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.699159 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 03:57:36.701883 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.699161 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 03:57:36.701883 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.699164 2578 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 03:57:36.701883 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.699166 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 03:57:36.701883 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.699169 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 03:57:36.701883 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.699172 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 03:57:36.701883 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.699174 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 03:57:36.702396 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.699177 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 03:57:36.702396 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.699179 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 03:57:36.702396 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.699182 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 03:57:36.702396 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.699185 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 03:57:36.702396 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.699188 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 03:57:36.702396 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.699191 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 03:57:36.702396 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.699193 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 03:57:36.702396 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.699196 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 03:57:36.702396 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.699199 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 03:57:36.702396 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.699201 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 03:57:36.702396 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.699204 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 03:57:36.702396 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.699206 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 03:57:36.702396 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.699209 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 03:57:36.702396 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.699211 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 03:57:36.702396 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.699214 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 03:57:36.702396 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.699216 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 03:57:36.702396 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.699219 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 03:57:36.702396 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.699221 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 03:57:36.702396 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.699224 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 03:57:36.702396 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.699226 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 03:57:36.702876 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.699229 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 03:57:36.702876 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.699231 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 03:57:36.702876 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.699234 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 03:57:36.702876 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.699236 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 03:57:36.702876 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.699239 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 03:57:36.702876 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.699242 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 03:57:36.702876 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.699245 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 03:57:36.702876 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.699247 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 03:57:36.702876 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.699250 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 03:57:36.702876 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.699252 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 03:57:36.702876 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.699255 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 03:57:36.702876 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.699257 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 03:57:36.702876 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.699260 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 03:57:36.702876 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:36.699262 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 03:57:36.702876 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.699267 2578 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 21 03:57:36.703254 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.700062 2578 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 21 03:57:36.706033 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.706017 2578 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 21 03:57:36.707056 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.707044 2578 server.go:1019] "Starting client certificate rotation"
Apr 21 03:57:36.707160 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.707142 2578 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 21 03:57:36.707195 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.707183 2578 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 21 03:57:36.733858 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.733839 2578 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 21 03:57:36.735648 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.735631 2578 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 21 03:57:36.749013 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.748992 2578 log.go:25] "Validated CRI v1 runtime API"
Apr 21 03:57:36.756177 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.756158 2578 log.go:25] "Validated CRI v1 image API"
Apr 21 03:57:36.757469 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.757453 2578 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 21 03:57:36.761747 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.761718 2578 fs.go:135] Filesystem UUIDs: map[5d90eb5f-0d80-4847-ac05-11b1f1460809:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2 a678d8df-846c-48a2-b0e2-7c3d3701eb1b:/dev/nvme0n1p4]
Apr 21 03:57:36.761826 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.761745 2578 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 21 03:57:36.764065 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.764048 2578 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 21 03:57:36.768810 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.768510 2578 manager.go:217] Machine: {Timestamp:2026-04-21 03:57:36.765589342 +0000 UTC m=+0.435028413 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3102125 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec275800ffe7298bc5d73c278ebcf4f3 SystemUUID:ec275800-ffe7-298b-c5d7-3c278ebcf4f3 BootID:6c5fd654-95d9-4df9-8915-2b19e3563672 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:25:59:48:47:69 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:25:59:48:47:69 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:de:8b:0d:b1:a5:ba Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 21 03:57:36.768810 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.768800 2578 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 21 03:57:36.768940 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.768903 2578 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 21 03:57:36.770213 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.770191 2578 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 21 03:57:36.770373 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.770214 2578 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-134-136.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":
"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 21 03:57:36.770423 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.770382 2578 topology_manager.go:138] "Creating topology manager with none policy" Apr 21 03:57:36.770423 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.770391 2578 container_manager_linux.go:306] "Creating device plugin manager" Apr 21 03:57:36.770423 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.770405 2578 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 21 03:57:36.770505 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.770425 2578 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 21 03:57:36.772004 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.771994 2578 state_mem.go:36] "Initialized new in-memory state store" Apr 21 03:57:36.772116 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.772107 2578 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 21 03:57:36.775260 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.775251 2578 kubelet.go:491] "Attempting to sync node with API server" Apr 21 03:57:36.775308 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.775264 2578 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 21 03:57:36.775308 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.775289 2578 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 21 03:57:36.775308 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.775302 2578 kubelet.go:397] "Adding apiserver pod source" Apr 21 03:57:36.775445 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.775324 2578 apiserver.go:42] "Waiting for node sync 
before watching apiserver pods" Apr 21 03:57:36.776566 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.776554 2578 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 21 03:57:36.776610 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.776571 2578 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 21 03:57:36.779735 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.779717 2578 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 21 03:57:36.781111 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.781098 2578 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 21 03:57:36.782506 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.782265 2578 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-qbjl7" Apr 21 03:57:36.782716 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.782701 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 21 03:57:36.782778 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.782720 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 21 03:57:36.782778 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.782726 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 21 03:57:36.782778 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.782732 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 21 03:57:36.782778 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.782739 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 21 03:57:36.782778 ip-10-0-134-136 kubenswrapper[2578]: I0421 
03:57:36.782748 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 21 03:57:36.782778 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.782754 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 21 03:57:36.782778 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.782760 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 21 03:57:36.782778 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.782768 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 21 03:57:36.782778 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.782774 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 21 03:57:36.782778 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.782782 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 21 03:57:36.783028 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.782792 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 21 03:57:36.783716 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.783707 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 21 03:57:36.783757 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.783717 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 21 03:57:36.787686 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.787673 2578 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 21 03:57:36.787816 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.787805 2578 server.go:1295] "Started kubelet" Apr 21 03:57:36.787897 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.787841 2578 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 21 03:57:36.787989 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.787941 2578 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 21 
03:57:36.788031 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.788019 2578 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 21 03:57:36.788059 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:57:36.788043 2578 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-134-136.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 21 03:57:36.788113 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.788100 2578 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-134-136.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 21 03:57:36.788221 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:57:36.788208 2578 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 21 03:57:36.788769 ip-10-0-134-136 systemd[1]: Started Kubernetes Kubelet. 
Apr 21 03:57:36.789130 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.789113 2578 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 21 03:57:36.790179 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.790157 2578 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-qbjl7" Apr 21 03:57:36.790384 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.790371 2578 server.go:317] "Adding debug handlers to kubelet server" Apr 21 03:57:36.795866 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.795846 2578 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 21 03:57:36.795866 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.795858 2578 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 21 03:57:36.796564 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.796504 2578 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 21 03:57:36.796564 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.796550 2578 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 21 03:57:36.796693 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.796506 2578 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 21 03:57:36.796693 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.796680 2578 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 21 03:57:36.796693 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.796684 2578 reconstruct.go:97] "Volume reconstruction finished" Apr 21 03:57:36.796693 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.796691 2578 factory.go:55] Registering systemd factory Apr 21 03:57:36.796693 ip-10-0-134-136 
kubenswrapper[2578]: I0421 03:57:36.796695 2578 reconciler.go:26] "Reconciler: start to sync state" Apr 21 03:57:36.796908 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.796697 2578 factory.go:223] Registration of the systemd container factory successfully Apr 21 03:57:36.796962 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:57:36.796911 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-136.ec2.internal\" not found" Apr 21 03:57:36.796962 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.796940 2578 factory.go:153] Registering CRI-O factory Apr 21 03:57:36.796962 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.796955 2578 factory.go:223] Registration of the crio container factory successfully Apr 21 03:57:36.797085 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.796978 2578 factory.go:103] Registering Raw factory Apr 21 03:57:36.797085 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.796992 2578 manager.go:1196] Started watching for new ooms in manager Apr 21 03:57:36.797465 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.797450 2578 manager.go:319] Starting recovery of all containers Apr 21 03:57:36.803032 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.803004 2578 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 03:57:36.803904 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:57:36.803882 2578 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-134-136.ec2.internal\" not found" node="ip-10-0-134-136.ec2.internal" Apr 21 03:57:36.804345 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:57:36.804325 2578 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 21 03:57:36.812857 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.812677 2578 manager.go:324] Recovery completed Apr 21 03:57:36.817163 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.817149 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 03:57:36.819713 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.819696 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-136.ec2.internal" event="NodeHasSufficientMemory" Apr 21 03:57:36.819795 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.819726 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-136.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 03:57:36.819795 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.819736 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-136.ec2.internal" event="NodeHasSufficientPID" Apr 21 03:57:36.820216 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.820204 2578 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 21 03:57:36.820216 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.820213 2578 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 21 03:57:36.820321 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.820231 2578 state_mem.go:36] "Initialized new in-memory state store" Apr 21 03:57:36.822735 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.822723 2578 policy_none.go:49] "None policy: Start" Apr 21 03:57:36.822776 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.822739 2578 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 21 03:57:36.822776 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.822748 2578 state_mem.go:35] "Initializing new in-memory state store" Apr 21 03:57:36.866851 ip-10-0-134-136 kubenswrapper[2578]: I0421 
03:57:36.866834 2578 manager.go:341] "Starting Device Plugin manager" Apr 21 03:57:36.875441 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:57:36.866896 2578 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 21 03:57:36.875441 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.866910 2578 server.go:85] "Starting device plugin registration server" Apr 21 03:57:36.875441 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.867152 2578 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 21 03:57:36.875441 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.867163 2578 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 21 03:57:36.875441 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.867250 2578 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 21 03:57:36.875441 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.867341 2578 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 21 03:57:36.875441 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.867350 2578 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 21 03:57:36.875441 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:57:36.867832 2578 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 21 03:57:36.875441 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:57:36.867868 2578 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-134-136.ec2.internal\" not found" Apr 21 03:57:36.922549 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.922504 2578 kubelet_network_linux.go:49] "Initialized iptables rules." 
protocol="IPv4" Apr 21 03:57:36.923709 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.923691 2578 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 21 03:57:36.923782 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.923725 2578 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 21 03:57:36.923782 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.923750 2578 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 21 03:57:36.923782 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.923758 2578 kubelet.go:2451] "Starting kubelet main sync loop" Apr 21 03:57:36.923921 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:57:36.923797 2578 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 21 03:57:36.925757 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.925733 2578 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 03:57:36.967434 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.967378 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 03:57:36.968671 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.968655 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-136.ec2.internal" event="NodeHasSufficientMemory" Apr 21 03:57:36.968740 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.968686 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-136.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 03:57:36.968740 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.968697 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-136.ec2.internal" event="NodeHasSufficientPID" Apr 21 03:57:36.968740 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.968718 2578 
kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-134-136.ec2.internal" Apr 21 03:57:36.974890 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:36.974873 2578 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-134-136.ec2.internal" Apr 21 03:57:36.974970 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:57:36.974896 2578 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-134-136.ec2.internal\": node \"ip-10-0-134-136.ec2.internal\" not found" Apr 21 03:57:36.995010 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:57:36.994995 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-136.ec2.internal\" not found" Apr 21 03:57:37.024327 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.024303 2578 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-136.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-134-136.ec2.internal"] Apr 21 03:57:37.024410 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.024381 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 03:57:37.025827 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.025813 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-136.ec2.internal" event="NodeHasSufficientMemory" Apr 21 03:57:37.025918 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.025846 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-136.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 03:57:37.025918 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.025860 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-136.ec2.internal" event="NodeHasSufficientPID" Apr 21 03:57:37.027455 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.027438 2578 kubelet_node_status.go:413] "Setting 
node annotation to enable volume controller attach/detach" Apr 21 03:57:37.027590 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.027576 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-136.ec2.internal" Apr 21 03:57:37.027633 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.027606 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 03:57:37.028069 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.028053 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-136.ec2.internal" event="NodeHasSufficientMemory" Apr 21 03:57:37.028069 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.028079 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-136.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 03:57:37.028190 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.028091 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-136.ec2.internal" event="NodeHasSufficientPID" Apr 21 03:57:37.028190 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.028053 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-136.ec2.internal" event="NodeHasSufficientMemory" Apr 21 03:57:37.028190 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.028165 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-136.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 03:57:37.028190 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.028179 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-136.ec2.internal" event="NodeHasSufficientPID" Apr 21 03:57:37.029256 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.029244 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-136.ec2.internal" Apr 21 03:57:37.029319 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.029265 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 03:57:37.029984 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.029969 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-136.ec2.internal" event="NodeHasSufficientMemory" Apr 21 03:57:37.030072 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.029996 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-136.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 03:57:37.030072 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.030010 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-136.ec2.internal" event="NodeHasSufficientPID" Apr 21 03:57:37.056981 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:57:37.056964 2578 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-134-136.ec2.internal\" not found" node="ip-10-0-134-136.ec2.internal" Apr 21 03:57:37.061752 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:57:37.061737 2578 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-134-136.ec2.internal\" not found" node="ip-10-0-134-136.ec2.internal" Apr 21 03:57:37.095256 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:57:37.095228 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-136.ec2.internal\" not found" Apr 21 03:57:37.098424 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.098406 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/7d94c65271f325dffc877a062645ae79-etc-kube\") pod 
\"kube-rbac-proxy-crio-ip-10-0-134-136.ec2.internal\" (UID: \"7d94c65271f325dffc877a062645ae79\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-136.ec2.internal" Apr 21 03:57:37.195768 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:57:37.195732 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-136.ec2.internal\" not found" Apr 21 03:57:37.199025 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.199008 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/7d94c65271f325dffc877a062645ae79-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-134-136.ec2.internal\" (UID: \"7d94c65271f325dffc877a062645ae79\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-136.ec2.internal" Apr 21 03:57:37.199077 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.199039 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7d94c65271f325dffc877a062645ae79-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-134-136.ec2.internal\" (UID: \"7d94c65271f325dffc877a062645ae79\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-136.ec2.internal" Apr 21 03:57:37.199077 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.199056 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/68f04465e53eef24d16ecd9de5ad5a12-config\") pod \"kube-apiserver-proxy-ip-10-0-134-136.ec2.internal\" (UID: \"68f04465e53eef24d16ecd9de5ad5a12\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-134-136.ec2.internal" Apr 21 03:57:37.199145 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.199105 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: 
\"kubernetes.io/host-path/7d94c65271f325dffc877a062645ae79-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-134-136.ec2.internal\" (UID: \"7d94c65271f325dffc877a062645ae79\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-136.ec2.internal" Apr 21 03:57:37.296491 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:57:37.296464 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-136.ec2.internal\" not found" Apr 21 03:57:37.299714 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.299700 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7d94c65271f325dffc877a062645ae79-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-134-136.ec2.internal\" (UID: \"7d94c65271f325dffc877a062645ae79\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-136.ec2.internal" Apr 21 03:57:37.299758 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.299722 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/68f04465e53eef24d16ecd9de5ad5a12-config\") pod \"kube-apiserver-proxy-ip-10-0-134-136.ec2.internal\" (UID: \"68f04465e53eef24d16ecd9de5ad5a12\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-134-136.ec2.internal" Apr 21 03:57:37.299758 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.299747 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/68f04465e53eef24d16ecd9de5ad5a12-config\") pod \"kube-apiserver-proxy-ip-10-0-134-136.ec2.internal\" (UID: \"68f04465e53eef24d16ecd9de5ad5a12\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-134-136.ec2.internal" Apr 21 03:57:37.299819 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.299784 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/7d94c65271f325dffc877a062645ae79-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-134-136.ec2.internal\" (UID: \"7d94c65271f325dffc877a062645ae79\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-136.ec2.internal" Apr 21 03:57:37.358903 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.358866 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-136.ec2.internal" Apr 21 03:57:37.364568 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.364549 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-136.ec2.internal" Apr 21 03:57:37.397357 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:57:37.397326 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-136.ec2.internal\" not found" Apr 21 03:57:37.497798 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:57:37.497761 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-136.ec2.internal\" not found" Apr 21 03:57:37.598220 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:57:37.598146 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-136.ec2.internal\" not found" Apr 21 03:57:37.698782 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:57:37.698748 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-136.ec2.internal\" not found" Apr 21 03:57:37.708049 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.708024 2578 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 21 03:57:37.708196 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.708179 2578 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" 
type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 21 03:57:37.708251 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.708211 2578 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 21 03:57:37.740910 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.740886 2578 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 03:57:37.775958 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.775934 2578 apiserver.go:52] "Watching apiserver" Apr 21 03:57:37.785918 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.785901 2578 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 21 03:57:37.789331 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.789305 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-j6rnj","openshift-multus/network-metrics-daemon-2lrq9","openshift-network-diagnostics/network-check-target-n6d2m","openshift-network-operator/iptables-alerter-7j4lb","kube-system/konnectivity-agent-7vx69","openshift-multus/multus-additional-cni-plugins-kmxxq","openshift-ovn-kubernetes/ovnkube-node-82wml","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-648s8","openshift-cluster-node-tuning-operator/tuned-x9r2c","openshift-dns/node-resolver-jxqfz","openshift-image-registry/node-ca-jpnv7"] Apr 21 03:57:37.792225 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.792203 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-j6rnj" Apr 21 03:57:37.792352 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.792256 2578 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-20 03:52:36 +0000 UTC" deadline="2027-12-19 06:13:36.568876586 +0000 UTC" Apr 21 03:57:37.792352 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.792312 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2lrq9" Apr 21 03:57:37.792352 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.792310 2578 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14570h15m58.77657044s" Apr 21 03:57:37.792509 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:57:37.792380 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2lrq9" podUID="6e98b17f-4794-44aa-8756-58a9bd9cb37a" Apr 21 03:57:37.793482 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.793465 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-n6d2m" Apr 21 03:57:37.793558 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:57:37.793515 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-n6d2m" podUID="b5d6aff7-e2e1-4646-9d82-8c931e49196e" Apr 21 03:57:37.794558 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.794544 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 21 03:57:37.794776 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.794763 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 21 03:57:37.794831 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.794790 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-7j4lb" Apr 21 03:57:37.795000 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.794870 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 21 03:57:37.795000 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.794892 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 21 03:57:37.795179 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.795164 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-zlc75\"" Apr 21 03:57:37.795986 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.795968 2578 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 21 03:57:37.796101 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.796088 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-7vx69" Apr 21 03:57:37.796221 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.796203 2578 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-136.ec2.internal" Apr 21 03:57:37.797203 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.797175 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 21 03:57:37.797362 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.797346 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 21 03:57:37.797433 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.797378 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 21 03:57:37.797522 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.797467 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-kmxxq" Apr 21 03:57:37.797522 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.797497 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-6cd8b\"" Apr 21 03:57:37.798365 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.798350 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-p2fp7\"" Apr 21 03:57:37.798455 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.798352 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 21 03:57:37.798455 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.798400 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 21 03:57:37.798890 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.798872 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-82wml" Apr 21 03:57:37.800160 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.800142 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-648s8" Apr 21 03:57:37.800607 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.800591 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 21 03:57:37.801120 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.801105 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 21 03:57:37.801427 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.801412 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 21 03:57:37.801427 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.801422 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 21 03:57:37.801656 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.801640 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-4f8rx\"" Apr 21 03:57:37.801723 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.801647 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-6l8s2\"" Apr 21 03:57:37.801723 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.801677 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 21 03:57:37.801821 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.801732 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 21 03:57:37.801821 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.801736 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 21 03:57:37.801821 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.801766 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 21 03:57:37.802005 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.801849 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-x9r2c" Apr 21 03:57:37.802065 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.802039 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/cd7df76b-5df8-4dcb-8e6c-8e98f5533e3b-system-cni-dir\") pod \"multus-additional-cni-plugins-kmxxq\" (UID: \"cd7df76b-5df8-4dcb-8e6c-8e98f5533e3b\") " pod="openshift-multus/multus-additional-cni-plugins-kmxxq" Apr 21 03:57:37.802108 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.802065 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7b9f5e8e-4f2f-4d07-87f7-ef56b124e3a6-cnibin\") pod \"multus-j6rnj\" (UID: \"7b9f5e8e-4f2f-4d07-87f7-ef56b124e3a6\") " pod="openshift-multus/multus-j6rnj" Apr 21 03:57:37.802108 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.802081 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/7b9f5e8e-4f2f-4d07-87f7-ef56b124e3a6-hostroot\") pod \"multus-j6rnj\" (UID: \"7b9f5e8e-4f2f-4d07-87f7-ef56b124e3a6\") " pod="openshift-multus/multus-j6rnj" Apr 21 03:57:37.802173 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.802115 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: 
\"kubernetes.io/configmap/7b9f5e8e-4f2f-4d07-87f7-ef56b124e3a6-multus-daemon-config\") pod \"multus-j6rnj\" (UID: \"7b9f5e8e-4f2f-4d07-87f7-ef56b124e3a6\") " pod="openshift-multus/multus-j6rnj" Apr 21 03:57:37.802173 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.802138 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fxfcv\" (UniqueName: \"kubernetes.io/projected/b4a64573-596e-4e34-a0d2-ec31f17a6ba5-kube-api-access-fxfcv\") pod \"iptables-alerter-7j4lb\" (UID: \"b4a64573-596e-4e34-a0d2-ec31f17a6ba5\") " pod="openshift-network-operator/iptables-alerter-7j4lb" Apr 21 03:57:37.802173 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.802155 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/cd7df76b-5df8-4dcb-8e6c-8e98f5533e3b-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-kmxxq\" (UID: \"cd7df76b-5df8-4dcb-8e6c-8e98f5533e3b\") " pod="openshift-multus/multus-additional-cni-plugins-kmxxq" Apr 21 03:57:37.802173 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.802170 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/7b9f5e8e-4f2f-4d07-87f7-ef56b124e3a6-host-run-k8s-cni-cncf-io\") pod \"multus-j6rnj\" (UID: \"7b9f5e8e-4f2f-4d07-87f7-ef56b124e3a6\") " pod="openshift-multus/multus-j6rnj" Apr 21 03:57:37.802316 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.802193 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/7b9f5e8e-4f2f-4d07-87f7-ef56b124e3a6-host-var-lib-cni-multus\") pod \"multus-j6rnj\" (UID: \"7b9f5e8e-4f2f-4d07-87f7-ef56b124e3a6\") " pod="openshift-multus/multus-j6rnj" Apr 21 03:57:37.802316 ip-10-0-134-136 
kubenswrapper[2578]: I0421 03:57:37.802241 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/cd7df76b-5df8-4dcb-8e6c-8e98f5533e3b-cni-binary-copy\") pod \"multus-additional-cni-plugins-kmxxq\" (UID: \"cd7df76b-5df8-4dcb-8e6c-8e98f5533e3b\") " pod="openshift-multus/multus-additional-cni-plugins-kmxxq" Apr 21 03:57:37.802316 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.802270 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/cd7df76b-5df8-4dcb-8e6c-8e98f5533e3b-os-release\") pod \"multus-additional-cni-plugins-kmxxq\" (UID: \"cd7df76b-5df8-4dcb-8e6c-8e98f5533e3b\") " pod="openshift-multus/multus-additional-cni-plugins-kmxxq" Apr 21 03:57:37.802415 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.802314 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7b9f5e8e-4f2f-4d07-87f7-ef56b124e3a6-multus-conf-dir\") pod \"multus-j6rnj\" (UID: \"7b9f5e8e-4f2f-4d07-87f7-ef56b124e3a6\") " pod="openshift-multus/multus-j6rnj" Apr 21 03:57:37.802415 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.802337 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7b9f5e8e-4f2f-4d07-87f7-ef56b124e3a6-etc-kubernetes\") pod \"multus-j6rnj\" (UID: \"7b9f5e8e-4f2f-4d07-87f7-ef56b124e3a6\") " pod="openshift-multus/multus-j6rnj" Apr 21 03:57:37.802415 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.802358 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5kdj9\" (UniqueName: \"kubernetes.io/projected/6e98b17f-4794-44aa-8756-58a9bd9cb37a-kube-api-access-5kdj9\") pod 
\"network-metrics-daemon-2lrq9\" (UID: \"6e98b17f-4794-44aa-8756-58a9bd9cb37a\") " pod="openshift-multus/network-metrics-daemon-2lrq9" Apr 21 03:57:37.802415 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.802383 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8jhrv\" (UniqueName: \"kubernetes.io/projected/b5d6aff7-e2e1-4646-9d82-8c931e49196e-kube-api-access-8jhrv\") pod \"network-check-target-n6d2m\" (UID: \"b5d6aff7-e2e1-4646-9d82-8c931e49196e\") " pod="openshift-network-diagnostics/network-check-target-n6d2m" Apr 21 03:57:37.802415 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.802402 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7b9f5e8e-4f2f-4d07-87f7-ef56b124e3a6-cni-binary-copy\") pod \"multus-j6rnj\" (UID: \"7b9f5e8e-4f2f-4d07-87f7-ef56b124e3a6\") " pod="openshift-multus/multus-j6rnj" Apr 21 03:57:37.802607 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.802418 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/adbc5b28-9686-4704-b170-0ada296e15b8-agent-certs\") pod \"konnectivity-agent-7vx69\" (UID: \"adbc5b28-9686-4704-b170-0ada296e15b8\") " pod="kube-system/konnectivity-agent-7vx69" Apr 21 03:57:37.802607 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.802444 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/adbc5b28-9686-4704-b170-0ada296e15b8-konnectivity-ca\") pod \"konnectivity-agent-7vx69\" (UID: \"adbc5b28-9686-4704-b170-0ada296e15b8\") " pod="kube-system/konnectivity-agent-7vx69" Apr 21 03:57:37.802607 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.802468 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-bh8tm\" (UniqueName: \"kubernetes.io/projected/cd7df76b-5df8-4dcb-8e6c-8e98f5533e3b-kube-api-access-bh8tm\") pod \"multus-additional-cni-plugins-kmxxq\" (UID: \"cd7df76b-5df8-4dcb-8e6c-8e98f5533e3b\") " pod="openshift-multus/multus-additional-cni-plugins-kmxxq" Apr 21 03:57:37.802607 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.802485 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7b9f5e8e-4f2f-4d07-87f7-ef56b124e3a6-host-run-netns\") pod \"multus-j6rnj\" (UID: \"7b9f5e8e-4f2f-4d07-87f7-ef56b124e3a6\") " pod="openshift-multus/multus-j6rnj" Apr 21 03:57:37.802607 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.802498 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7b9f5e8e-4f2f-4d07-87f7-ef56b124e3a6-host-var-lib-kubelet\") pod \"multus-j6rnj\" (UID: \"7b9f5e8e-4f2f-4d07-87f7-ef56b124e3a6\") " pod="openshift-multus/multus-j6rnj" Apr 21 03:57:37.802607 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.802520 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6e98b17f-4794-44aa-8756-58a9bd9cb37a-metrics-certs\") pod \"network-metrics-daemon-2lrq9\" (UID: \"6e98b17f-4794-44aa-8756-58a9bd9cb37a\") " pod="openshift-multus/network-metrics-daemon-2lrq9" Apr 21 03:57:37.802607 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.802544 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 21 03:57:37.802607 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.802553 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/cd7df76b-5df8-4dcb-8e6c-8e98f5533e3b-cnibin\") pod \"multus-additional-cni-plugins-kmxxq\" (UID: \"cd7df76b-5df8-4dcb-8e6c-8e98f5533e3b\") " pod="openshift-multus/multus-additional-cni-plugins-kmxxq" Apr 21 03:57:37.802607 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.802568 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/7b9f5e8e-4f2f-4d07-87f7-ef56b124e3a6-host-run-multus-certs\") pod \"multus-j6rnj\" (UID: \"7b9f5e8e-4f2f-4d07-87f7-ef56b124e3a6\") " pod="openshift-multus/multus-j6rnj" Apr 21 03:57:37.802607 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.802582 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7b9f5e8e-4f2f-4d07-87f7-ef56b124e3a6-os-release\") pod \"multus-j6rnj\" (UID: \"7b9f5e8e-4f2f-4d07-87f7-ef56b124e3a6\") " pod="openshift-multus/multus-j6rnj" Apr 21 03:57:37.802941 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.802620 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/7b9f5e8e-4f2f-4d07-87f7-ef56b124e3a6-multus-socket-dir-parent\") pod \"multus-j6rnj\" (UID: \"7b9f5e8e-4f2f-4d07-87f7-ef56b124e3a6\") " pod="openshift-multus/multus-j6rnj" Apr 21 03:57:37.802941 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.802636 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/cd7df76b-5df8-4dcb-8e6c-8e98f5533e3b-tuning-conf-dir\") pod \"multus-additional-cni-plugins-kmxxq\" (UID: \"cd7df76b-5df8-4dcb-8e6c-8e98f5533e3b\") " pod="openshift-multus/multus-additional-cni-plugins-kmxxq" Apr 21 03:57:37.802941 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.802652 2578 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/b4a64573-596e-4e34-a0d2-ec31f17a6ba5-iptables-alerter-script\") pod \"iptables-alerter-7j4lb\" (UID: \"b4a64573-596e-4e34-a0d2-ec31f17a6ba5\") " pod="openshift-network-operator/iptables-alerter-7j4lb" Apr 21 03:57:37.802941 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.802680 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/cd7df76b-5df8-4dcb-8e6c-8e98f5533e3b-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-kmxxq\" (UID: \"cd7df76b-5df8-4dcb-8e6c-8e98f5533e3b\") " pod="openshift-multus/multus-additional-cni-plugins-kmxxq" Apr 21 03:57:37.802941 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.802709 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7b9f5e8e-4f2f-4d07-87f7-ef56b124e3a6-system-cni-dir\") pod \"multus-j6rnj\" (UID: \"7b9f5e8e-4f2f-4d07-87f7-ef56b124e3a6\") " pod="openshift-multus/multus-j6rnj" Apr 21 03:57:37.802941 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.802729 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7b9f5e8e-4f2f-4d07-87f7-ef56b124e3a6-multus-cni-dir\") pod \"multus-j6rnj\" (UID: \"7b9f5e8e-4f2f-4d07-87f7-ef56b124e3a6\") " pod="openshift-multus/multus-j6rnj" Apr 21 03:57:37.802941 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.802773 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7b9f5e8e-4f2f-4d07-87f7-ef56b124e3a6-host-var-lib-cni-bin\") pod \"multus-j6rnj\" (UID: 
\"7b9f5e8e-4f2f-4d07-87f7-ef56b124e3a6\") " pod="openshift-multus/multus-j6rnj" Apr 21 03:57:37.802941 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.802795 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lm5b4\" (UniqueName: \"kubernetes.io/projected/7b9f5e8e-4f2f-4d07-87f7-ef56b124e3a6-kube-api-access-lm5b4\") pod \"multus-j6rnj\" (UID: \"7b9f5e8e-4f2f-4d07-87f7-ef56b124e3a6\") " pod="openshift-multus/multus-j6rnj" Apr 21 03:57:37.802941 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.802821 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b4a64573-596e-4e34-a0d2-ec31f17a6ba5-host-slash\") pod \"iptables-alerter-7j4lb\" (UID: \"b4a64573-596e-4e34-a0d2-ec31f17a6ba5\") " pod="openshift-network-operator/iptables-alerter-7j4lb" Apr 21 03:57:37.802941 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.802864 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 21 03:57:37.803231 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.803054 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-zkdlh\"" Apr 21 03:57:37.803454 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.803439 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-jxqfz" Apr 21 03:57:37.803674 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.803644 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 21 03:57:37.804145 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.804131 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 21 03:57:37.804439 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.804422 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 21 03:57:37.804521 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.804504 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-lh26x\"" Apr 21 03:57:37.804887 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.804871 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-jpnv7" Apr 21 03:57:37.806378 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.806358 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 21 03:57:37.806378 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.806365 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 21 03:57:37.806510 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.806418 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-mplvf\"" Apr 21 03:57:37.806838 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.806824 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 21 03:57:37.807360 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.807344 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-bsm59\"" Apr 21 03:57:37.807704 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.807683 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 21 03:57:37.811977 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.811955 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 21 03:57:37.813342 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.813323 2578 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 21 03:57:37.813457 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.813421 2578 kubelet.go:3340] "Creating a mirror pod for static pod" 
pod="kube-system/kube-apiserver-proxy-ip-10-0-134-136.ec2.internal" Apr 21 03:57:37.813653 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.813628 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-136.ec2.internal"] Apr 21 03:57:37.815526 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.815509 2578 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 21 03:57:37.820554 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.820536 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/kube-apiserver-proxy-ip-10-0-134-136.ec2.internal"] Apr 21 03:57:37.820739 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.820727 2578 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 21 03:57:37.832828 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.832808 2578 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-5g62v" Apr 21 03:57:37.841044 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.841024 2578 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-5g62v" Apr 21 03:57:37.898165 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.898145 2578 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 21 03:57:37.903742 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.903719 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7b9f5e8e-4f2f-4d07-87f7-ef56b124e3a6-cnibin\") pod \"multus-j6rnj\" (UID: \"7b9f5e8e-4f2f-4d07-87f7-ef56b124e3a6\") " pod="openshift-multus/multus-j6rnj" Apr 21 
03:57:37.903833 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.903758 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76hn5\" (UniqueName: \"kubernetes.io/projected/f4acd3fa-d747-4053-9af8-c38066e122ab-kube-api-access-76hn5\") pod \"node-resolver-jxqfz\" (UID: \"f4acd3fa-d747-4053-9af8-c38066e122ab\") " pod="openshift-dns/node-resolver-jxqfz" Apr 21 03:57:37.903833 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.903781 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/37b63820-2bf0-4a5b-82c5-6b56ab7689b7-ovnkube-config\") pod \"ovnkube-node-82wml\" (UID: \"37b63820-2bf0-4a5b-82c5-6b56ab7689b7\") " pod="openshift-ovn-kubernetes/ovnkube-node-82wml" Apr 21 03:57:37.903833 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.903796 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/37b63820-2bf0-4a5b-82c5-6b56ab7689b7-env-overrides\") pod \"ovnkube-node-82wml\" (UID: \"37b63820-2bf0-4a5b-82c5-6b56ab7689b7\") " pod="openshift-ovn-kubernetes/ovnkube-node-82wml" Apr 21 03:57:37.903833 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.903820 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/7b9f5e8e-4f2f-4d07-87f7-ef56b124e3a6-host-run-k8s-cni-cncf-io\") pod \"multus-j6rnj\" (UID: \"7b9f5e8e-4f2f-4d07-87f7-ef56b124e3a6\") " pod="openshift-multus/multus-j6rnj" Apr 21 03:57:37.904029 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.903846 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/37b63820-2bf0-4a5b-82c5-6b56ab7689b7-ovnkube-script-lib\") pod \"ovnkube-node-82wml\" 
(UID: \"37b63820-2bf0-4a5b-82c5-6b56ab7689b7\") " pod="openshift-ovn-kubernetes/ovnkube-node-82wml" Apr 21 03:57:37.904029 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.903858 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7b9f5e8e-4f2f-4d07-87f7-ef56b124e3a6-cnibin\") pod \"multus-j6rnj\" (UID: \"7b9f5e8e-4f2f-4d07-87f7-ef56b124e3a6\") " pod="openshift-multus/multus-j6rnj" Apr 21 03:57:37.904029 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.903866 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-smnbx\" (UniqueName: \"kubernetes.io/projected/30292a27-0318-4f49-b26c-f54654ac07db-kube-api-access-smnbx\") pod \"node-ca-jpnv7\" (UID: \"30292a27-0318-4f49-b26c-f54654ac07db\") " pod="openshift-image-registry/node-ca-jpnv7" Apr 21 03:57:37.904029 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.903881 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/a2aaf4f5-f0bd-4a7f-911d-4030333f4bb7-socket-dir\") pod \"aws-ebs-csi-driver-node-648s8\" (UID: \"a2aaf4f5-f0bd-4a7f-911d-4030333f4bb7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-648s8" Apr 21 03:57:37.904029 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.903897 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/7b9f5e8e-4f2f-4d07-87f7-ef56b124e3a6-host-run-k8s-cni-cncf-io\") pod \"multus-j6rnj\" (UID: \"7b9f5e8e-4f2f-4d07-87f7-ef56b124e3a6\") " pod="openshift-multus/multus-j6rnj" Apr 21 03:57:37.904029 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.903901 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/cd7df76b-5df8-4dcb-8e6c-8e98f5533e3b-cni-binary-copy\") 
pod \"multus-additional-cni-plugins-kmxxq\" (UID: \"cd7df76b-5df8-4dcb-8e6c-8e98f5533e3b\") " pod="openshift-multus/multus-additional-cni-plugins-kmxxq" Apr 21 03:57:37.904029 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.903941 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/cd7df76b-5df8-4dcb-8e6c-8e98f5533e3b-os-release\") pod \"multus-additional-cni-plugins-kmxxq\" (UID: \"cd7df76b-5df8-4dcb-8e6c-8e98f5533e3b\") " pod="openshift-multus/multus-additional-cni-plugins-kmxxq" Apr 21 03:57:37.904029 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.903985 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7b9f5e8e-4f2f-4d07-87f7-ef56b124e3a6-multus-conf-dir\") pod \"multus-j6rnj\" (UID: \"7b9f5e8e-4f2f-4d07-87f7-ef56b124e3a6\") " pod="openshift-multus/multus-j6rnj" Apr 21 03:57:37.904029 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.904026 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/cd7df76b-5df8-4dcb-8e6c-8e98f5533e3b-os-release\") pod \"multus-additional-cni-plugins-kmxxq\" (UID: \"cd7df76b-5df8-4dcb-8e6c-8e98f5533e3b\") " pod="openshift-multus/multus-additional-cni-plugins-kmxxq" Apr 21 03:57:37.904471 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.904045 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7b9f5e8e-4f2f-4d07-87f7-ef56b124e3a6-multus-conf-dir\") pod \"multus-j6rnj\" (UID: \"7b9f5e8e-4f2f-4d07-87f7-ef56b124e3a6\") " pod="openshift-multus/multus-j6rnj" Apr 21 03:57:37.904471 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.904090 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/7b9f5e8e-4f2f-4d07-87f7-ef56b124e3a6-etc-kubernetes\") pod \"multus-j6rnj\" (UID: \"7b9f5e8e-4f2f-4d07-87f7-ef56b124e3a6\") " pod="openshift-multus/multus-j6rnj" Apr 21 03:57:37.904471 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.904138 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a2aaf4f5-f0bd-4a7f-911d-4030333f4bb7-kubelet-dir\") pod \"aws-ebs-csi-driver-node-648s8\" (UID: \"a2aaf4f5-f0bd-4a7f-911d-4030333f4bb7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-648s8" Apr 21 03:57:37.904471 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.904164 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8jhrv\" (UniqueName: \"kubernetes.io/projected/b5d6aff7-e2e1-4646-9d82-8c931e49196e-kube-api-access-8jhrv\") pod \"network-check-target-n6d2m\" (UID: \"b5d6aff7-e2e1-4646-9d82-8c931e49196e\") " pod="openshift-network-diagnostics/network-check-target-n6d2m" Apr 21 03:57:37.904471 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.904190 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7b9f5e8e-4f2f-4d07-87f7-ef56b124e3a6-etc-kubernetes\") pod \"multus-j6rnj\" (UID: \"7b9f5e8e-4f2f-4d07-87f7-ef56b124e3a6\") " pod="openshift-multus/multus-j6rnj" Apr 21 03:57:37.904471 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.904196 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/37b63820-2bf0-4a5b-82c5-6b56ab7689b7-host-cni-netd\") pod \"ovnkube-node-82wml\" (UID: \"37b63820-2bf0-4a5b-82c5-6b56ab7689b7\") " pod="openshift-ovn-kubernetes/ovnkube-node-82wml" Apr 21 03:57:37.904471 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.904220 2578 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/adbc5b28-9686-4704-b170-0ada296e15b8-agent-certs\") pod \"konnectivity-agent-7vx69\" (UID: \"adbc5b28-9686-4704-b170-0ada296e15b8\") " pod="kube-system/konnectivity-agent-7vx69" Apr 21 03:57:37.904471 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.904236 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/adbc5b28-9686-4704-b170-0ada296e15b8-konnectivity-ca\") pod \"konnectivity-agent-7vx69\" (UID: \"adbc5b28-9686-4704-b170-0ada296e15b8\") " pod="kube-system/konnectivity-agent-7vx69" Apr 21 03:57:37.904471 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.904253 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bh8tm\" (UniqueName: \"kubernetes.io/projected/cd7df76b-5df8-4dcb-8e6c-8e98f5533e3b-kube-api-access-bh8tm\") pod \"multus-additional-cni-plugins-kmxxq\" (UID: \"cd7df76b-5df8-4dcb-8e6c-8e98f5533e3b\") " pod="openshift-multus/multus-additional-cni-plugins-kmxxq" Apr 21 03:57:37.904471 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.904272 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7b9f5e8e-4f2f-4d07-87f7-ef56b124e3a6-host-run-netns\") pod \"multus-j6rnj\" (UID: \"7b9f5e8e-4f2f-4d07-87f7-ef56b124e3a6\") " pod="openshift-multus/multus-j6rnj" Apr 21 03:57:37.904471 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.904305 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/30292a27-0318-4f49-b26c-f54654ac07db-host\") pod \"node-ca-jpnv7\" (UID: \"30292a27-0318-4f49-b26c-f54654ac07db\") " pod="openshift-image-registry/node-ca-jpnv7" Apr 21 03:57:37.904471 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.904329 2578 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/37b63820-2bf0-4a5b-82c5-6b56ab7689b7-host-kubelet\") pod \"ovnkube-node-82wml\" (UID: \"37b63820-2bf0-4a5b-82c5-6b56ab7689b7\") " pod="openshift-ovn-kubernetes/ovnkube-node-82wml" Apr 21 03:57:37.904471 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.904352 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/a2aaf4f5-f0bd-4a7f-911d-4030333f4bb7-etc-selinux\") pod \"aws-ebs-csi-driver-node-648s8\" (UID: \"a2aaf4f5-f0bd-4a7f-911d-4030333f4bb7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-648s8" Apr 21 03:57:37.904471 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.904423 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/6de91b0f-af7b-43a5-8a43-cee7c5ba996e-etc-sysctl-conf\") pod \"tuned-x9r2c\" (UID: \"6de91b0f-af7b-43a5-8a43-cee7c5ba996e\") " pod="openshift-cluster-node-tuning-operator/tuned-x9r2c" Apr 21 03:57:37.904471 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.904447 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/6de91b0f-af7b-43a5-8a43-cee7c5ba996e-run\") pod \"tuned-x9r2c\" (UID: \"6de91b0f-af7b-43a5-8a43-cee7c5ba996e\") " pod="openshift-cluster-node-tuning-operator/tuned-x9r2c" Apr 21 03:57:37.904471 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.904470 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6de91b0f-af7b-43a5-8a43-cee7c5ba996e-lib-modules\") pod \"tuned-x9r2c\" (UID: \"6de91b0f-af7b-43a5-8a43-cee7c5ba996e\") " 
pod="openshift-cluster-node-tuning-operator/tuned-x9r2c" Apr 21 03:57:37.904471 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.904480 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7b9f5e8e-4f2f-4d07-87f7-ef56b124e3a6-host-run-netns\") pod \"multus-j6rnj\" (UID: \"7b9f5e8e-4f2f-4d07-87f7-ef56b124e3a6\") " pod="openshift-multus/multus-j6rnj" Apr 21 03:57:37.905247 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.904521 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/7b9f5e8e-4f2f-4d07-87f7-ef56b124e3a6-multus-socket-dir-parent\") pod \"multus-j6rnj\" (UID: \"7b9f5e8e-4f2f-4d07-87f7-ef56b124e3a6\") " pod="openshift-multus/multus-j6rnj" Apr 21 03:57:37.905247 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.904553 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/b4a64573-596e-4e34-a0d2-ec31f17a6ba5-iptables-alerter-script\") pod \"iptables-alerter-7j4lb\" (UID: \"b4a64573-596e-4e34-a0d2-ec31f17a6ba5\") " pod="openshift-network-operator/iptables-alerter-7j4lb" Apr 21 03:57:37.905247 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.904616 2578 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 21 03:57:37.905247 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.904651 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/7b9f5e8e-4f2f-4d07-87f7-ef56b124e3a6-multus-socket-dir-parent\") pod \"multus-j6rnj\" (UID: \"7b9f5e8e-4f2f-4d07-87f7-ef56b124e3a6\") " pod="openshift-multus/multus-j6rnj" Apr 21 03:57:37.905247 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.904688 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/cd7df76b-5df8-4dcb-8e6c-8e98f5533e3b-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-kmxxq\" (UID: \"cd7df76b-5df8-4dcb-8e6c-8e98f5533e3b\") " pod="openshift-multus/multus-additional-cni-plugins-kmxxq" Apr 21 03:57:37.905247 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.904893 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/f4acd3fa-d747-4053-9af8-c38066e122ab-hosts-file\") pod \"node-resolver-jxqfz\" (UID: \"f4acd3fa-d747-4053-9af8-c38066e122ab\") " pod="openshift-dns/node-resolver-jxqfz" Apr 21 03:57:37.905247 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.904924 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/37b63820-2bf0-4a5b-82c5-6b56ab7689b7-host-run-ovn-kubernetes\") pod \"ovnkube-node-82wml\" (UID: \"37b63820-2bf0-4a5b-82c5-6b56ab7689b7\") " pod="openshift-ovn-kubernetes/ovnkube-node-82wml" Apr 21 03:57:37.905247 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.904938 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/adbc5b28-9686-4704-b170-0ada296e15b8-konnectivity-ca\") pod \"konnectivity-agent-7vx69\" (UID: \"adbc5b28-9686-4704-b170-0ada296e15b8\") " pod="kube-system/konnectivity-agent-7vx69" Apr 21 03:57:37.905247 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.904954 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/37b63820-2bf0-4a5b-82c5-6b56ab7689b7-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-82wml\" (UID: \"37b63820-2bf0-4a5b-82c5-6b56ab7689b7\") " pod="openshift-ovn-kubernetes/ovnkube-node-82wml" Apr 21 03:57:37.905247 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.904982 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/6de91b0f-af7b-43a5-8a43-cee7c5ba996e-etc-tuned\") pod \"tuned-x9r2c\" (UID: \"6de91b0f-af7b-43a5-8a43-cee7c5ba996e\") " pod="openshift-cluster-node-tuning-operator/tuned-x9r2c" Apr 21 03:57:37.905247 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.905010 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/37b63820-2bf0-4a5b-82c5-6b56ab7689b7-var-lib-openvswitch\") pod \"ovnkube-node-82wml\" (UID: \"37b63820-2bf0-4a5b-82c5-6b56ab7689b7\") " pod="openshift-ovn-kubernetes/ovnkube-node-82wml" Apr 21 03:57:37.905247 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.905034 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/6de91b0f-af7b-43a5-8a43-cee7c5ba996e-etc-sysconfig\") pod \"tuned-x9r2c\" (UID: \"6de91b0f-af7b-43a5-8a43-cee7c5ba996e\") " pod="openshift-cluster-node-tuning-operator/tuned-x9r2c" Apr 21 
03:57:37.905247 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.905042 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/b4a64573-596e-4e34-a0d2-ec31f17a6ba5-iptables-alerter-script\") pod \"iptables-alerter-7j4lb\" (UID: \"b4a64573-596e-4e34-a0d2-ec31f17a6ba5\") " pod="openshift-network-operator/iptables-alerter-7j4lb" Apr 21 03:57:37.905247 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.905059 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/6de91b0f-af7b-43a5-8a43-cee7c5ba996e-etc-sysctl-d\") pod \"tuned-x9r2c\" (UID: \"6de91b0f-af7b-43a5-8a43-cee7c5ba996e\") " pod="openshift-cluster-node-tuning-operator/tuned-x9r2c" Apr 21 03:57:37.905247 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.905090 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b4a64573-596e-4e34-a0d2-ec31f17a6ba5-host-slash\") pod \"iptables-alerter-7j4lb\" (UID: \"b4a64573-596e-4e34-a0d2-ec31f17a6ba5\") " pod="openshift-network-operator/iptables-alerter-7j4lb" Apr 21 03:57:37.905247 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.905114 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/cd7df76b-5df8-4dcb-8e6c-8e98f5533e3b-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-kmxxq\" (UID: \"cd7df76b-5df8-4dcb-8e6c-8e98f5533e3b\") " pod="openshift-multus/multus-additional-cni-plugins-kmxxq" Apr 21 03:57:37.905247 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.905126 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/cd7df76b-5df8-4dcb-8e6c-8e98f5533e3b-system-cni-dir\") pod 
\"multus-additional-cni-plugins-kmxxq\" (UID: \"cd7df76b-5df8-4dcb-8e6c-8e98f5533e3b\") " pod="openshift-multus/multus-additional-cni-plugins-kmxxq" Apr 21 03:57:37.905997 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.905135 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b4a64573-596e-4e34-a0d2-ec31f17a6ba5-host-slash\") pod \"iptables-alerter-7j4lb\" (UID: \"b4a64573-596e-4e34-a0d2-ec31f17a6ba5\") " pod="openshift-network-operator/iptables-alerter-7j4lb" Apr 21 03:57:37.905997 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.905167 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/7b9f5e8e-4f2f-4d07-87f7-ef56b124e3a6-hostroot\") pod \"multus-j6rnj\" (UID: \"7b9f5e8e-4f2f-4d07-87f7-ef56b124e3a6\") " pod="openshift-multus/multus-j6rnj" Apr 21 03:57:37.905997 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.905183 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/7b9f5e8e-4f2f-4d07-87f7-ef56b124e3a6-multus-daemon-config\") pod \"multus-j6rnj\" (UID: \"7b9f5e8e-4f2f-4d07-87f7-ef56b124e3a6\") " pod="openshift-multus/multus-j6rnj" Apr 21 03:57:37.905997 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.905230 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/cd7df76b-5df8-4dcb-8e6c-8e98f5533e3b-system-cni-dir\") pod \"multus-additional-cni-plugins-kmxxq\" (UID: \"cd7df76b-5df8-4dcb-8e6c-8e98f5533e3b\") " pod="openshift-multus/multus-additional-cni-plugins-kmxxq" Apr 21 03:57:37.905997 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.905230 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/7b9f5e8e-4f2f-4d07-87f7-ef56b124e3a6-hostroot\") pod 
\"multus-j6rnj\" (UID: \"7b9f5e8e-4f2f-4d07-87f7-ef56b124e3a6\") " pod="openshift-multus/multus-j6rnj" Apr 21 03:57:37.905997 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.905261 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7g22\" (UniqueName: \"kubernetes.io/projected/a2aaf4f5-f0bd-4a7f-911d-4030333f4bb7-kube-api-access-b7g22\") pod \"aws-ebs-csi-driver-node-648s8\" (UID: \"a2aaf4f5-f0bd-4a7f-911d-4030333f4bb7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-648s8" Apr 21 03:57:37.905997 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.905321 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gmvwp\" (UniqueName: \"kubernetes.io/projected/6de91b0f-af7b-43a5-8a43-cee7c5ba996e-kube-api-access-gmvwp\") pod \"tuned-x9r2c\" (UID: \"6de91b0f-af7b-43a5-8a43-cee7c5ba996e\") " pod="openshift-cluster-node-tuning-operator/tuned-x9r2c" Apr 21 03:57:37.905997 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.905335 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/cd7df76b-5df8-4dcb-8e6c-8e98f5533e3b-cni-binary-copy\") pod \"multus-additional-cni-plugins-kmxxq\" (UID: \"cd7df76b-5df8-4dcb-8e6c-8e98f5533e3b\") " pod="openshift-multus/multus-additional-cni-plugins-kmxxq" Apr 21 03:57:37.905997 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.905352 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fxfcv\" (UniqueName: \"kubernetes.io/projected/b4a64573-596e-4e34-a0d2-ec31f17a6ba5-kube-api-access-fxfcv\") pod \"iptables-alerter-7j4lb\" (UID: \"b4a64573-596e-4e34-a0d2-ec31f17a6ba5\") " pod="openshift-network-operator/iptables-alerter-7j4lb" Apr 21 03:57:37.905997 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.905383 2578 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/cd7df76b-5df8-4dcb-8e6c-8e98f5533e3b-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-kmxxq\" (UID: \"cd7df76b-5df8-4dcb-8e6c-8e98f5533e3b\") " pod="openshift-multus/multus-additional-cni-plugins-kmxxq" Apr 21 03:57:37.905997 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.905411 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f4acd3fa-d747-4053-9af8-c38066e122ab-tmp-dir\") pod \"node-resolver-jxqfz\" (UID: \"f4acd3fa-d747-4053-9af8-c38066e122ab\") " pod="openshift-dns/node-resolver-jxqfz" Apr 21 03:57:37.905997 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.905432 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/37b63820-2bf0-4a5b-82c5-6b56ab7689b7-systemd-units\") pod \"ovnkube-node-82wml\" (UID: \"37b63820-2bf0-4a5b-82c5-6b56ab7689b7\") " pod="openshift-ovn-kubernetes/ovnkube-node-82wml" Apr 21 03:57:37.905997 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.905451 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6de91b0f-af7b-43a5-8a43-cee7c5ba996e-tmp\") pod \"tuned-x9r2c\" (UID: \"6de91b0f-af7b-43a5-8a43-cee7c5ba996e\") " pod="openshift-cluster-node-tuning-operator/tuned-x9r2c" Apr 21 03:57:37.905997 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.905480 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/7b9f5e8e-4f2f-4d07-87f7-ef56b124e3a6-host-var-lib-cni-multus\") pod \"multus-j6rnj\" (UID: \"7b9f5e8e-4f2f-4d07-87f7-ef56b124e3a6\") " pod="openshift-multus/multus-j6rnj" Apr 21 03:57:37.905997 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.905505 
2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5kdj9\" (UniqueName: \"kubernetes.io/projected/6e98b17f-4794-44aa-8756-58a9bd9cb37a-kube-api-access-5kdj9\") pod \"network-metrics-daemon-2lrq9\" (UID: \"6e98b17f-4794-44aa-8756-58a9bd9cb37a\") " pod="openshift-multus/network-metrics-daemon-2lrq9" Apr 21 03:57:37.905997 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.905528 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/37b63820-2bf0-4a5b-82c5-6b56ab7689b7-node-log\") pod \"ovnkube-node-82wml\" (UID: \"37b63820-2bf0-4a5b-82c5-6b56ab7689b7\") " pod="openshift-ovn-kubernetes/ovnkube-node-82wml" Apr 21 03:57:37.905997 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.905549 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/6de91b0f-af7b-43a5-8a43-cee7c5ba996e-etc-modprobe-d\") pod \"tuned-x9r2c\" (UID: \"6de91b0f-af7b-43a5-8a43-cee7c5ba996e\") " pod="openshift-cluster-node-tuning-operator/tuned-x9r2c" Apr 21 03:57:37.906790 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.905581 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7b9f5e8e-4f2f-4d07-87f7-ef56b124e3a6-cni-binary-copy\") pod \"multus-j6rnj\" (UID: \"7b9f5e8e-4f2f-4d07-87f7-ef56b124e3a6\") " pod="openshift-multus/multus-j6rnj" Apr 21 03:57:37.906790 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.905609 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/7b9f5e8e-4f2f-4d07-87f7-ef56b124e3a6-host-var-lib-cni-multus\") pod \"multus-j6rnj\" (UID: \"7b9f5e8e-4f2f-4d07-87f7-ef56b124e3a6\") " pod="openshift-multus/multus-j6rnj" Apr 21 03:57:37.906790 ip-10-0-134-136 
kubenswrapper[2578]: I0421 03:57:37.905612 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/a2aaf4f5-f0bd-4a7f-911d-4030333f4bb7-device-dir\") pod \"aws-ebs-csi-driver-node-648s8\" (UID: \"a2aaf4f5-f0bd-4a7f-911d-4030333f4bb7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-648s8" Apr 21 03:57:37.906790 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.905645 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6de91b0f-af7b-43a5-8a43-cee7c5ba996e-host\") pod \"tuned-x9r2c\" (UID: \"6de91b0f-af7b-43a5-8a43-cee7c5ba996e\") " pod="openshift-cluster-node-tuning-operator/tuned-x9r2c" Apr 21 03:57:37.906790 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.905675 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7b9f5e8e-4f2f-4d07-87f7-ef56b124e3a6-host-var-lib-kubelet\") pod \"multus-j6rnj\" (UID: \"7b9f5e8e-4f2f-4d07-87f7-ef56b124e3a6\") " pod="openshift-multus/multus-j6rnj" Apr 21 03:57:37.906790 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.905677 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/7b9f5e8e-4f2f-4d07-87f7-ef56b124e3a6-multus-daemon-config\") pod \"multus-j6rnj\" (UID: \"7b9f5e8e-4f2f-4d07-87f7-ef56b124e3a6\") " pod="openshift-multus/multus-j6rnj" Apr 21 03:57:37.906790 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.905721 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6e98b17f-4794-44aa-8756-58a9bd9cb37a-metrics-certs\") pod \"network-metrics-daemon-2lrq9\" (UID: \"6e98b17f-4794-44aa-8756-58a9bd9cb37a\") " pod="openshift-multus/network-metrics-daemon-2lrq9" Apr 
21 03:57:37.906790 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.905748 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/37b63820-2bf0-4a5b-82c5-6b56ab7689b7-host-slash\") pod \"ovnkube-node-82wml\" (UID: \"37b63820-2bf0-4a5b-82c5-6b56ab7689b7\") " pod="openshift-ovn-kubernetes/ovnkube-node-82wml" Apr 21 03:57:37.906790 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.905740 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7b9f5e8e-4f2f-4d07-87f7-ef56b124e3a6-host-var-lib-kubelet\") pod \"multus-j6rnj\" (UID: \"7b9f5e8e-4f2f-4d07-87f7-ef56b124e3a6\") " pod="openshift-multus/multus-j6rnj" Apr 21 03:57:37.906790 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.905773 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/37b63820-2bf0-4a5b-82c5-6b56ab7689b7-host-run-netns\") pod \"ovnkube-node-82wml\" (UID: \"37b63820-2bf0-4a5b-82c5-6b56ab7689b7\") " pod="openshift-ovn-kubernetes/ovnkube-node-82wml" Apr 21 03:57:37.906790 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.905817 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/37b63820-2bf0-4a5b-82c5-6b56ab7689b7-run-systemd\") pod \"ovnkube-node-82wml\" (UID: \"37b63820-2bf0-4a5b-82c5-6b56ab7689b7\") " pod="openshift-ovn-kubernetes/ovnkube-node-82wml" Apr 21 03:57:37.906790 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.905844 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/37b63820-2bf0-4a5b-82c5-6b56ab7689b7-run-ovn\") pod \"ovnkube-node-82wml\" (UID: \"37b63820-2bf0-4a5b-82c5-6b56ab7689b7\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-82wml" Apr 21 03:57:37.906790 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.905877 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/cd7df76b-5df8-4dcb-8e6c-8e98f5533e3b-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-kmxxq\" (UID: \"cd7df76b-5df8-4dcb-8e6c-8e98f5533e3b\") " pod="openshift-multus/multus-additional-cni-plugins-kmxxq" Apr 21 03:57:37.906790 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.905892 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6de91b0f-af7b-43a5-8a43-cee7c5ba996e-etc-kubernetes\") pod \"tuned-x9r2c\" (UID: \"6de91b0f-af7b-43a5-8a43-cee7c5ba996e\") " pod="openshift-cluster-node-tuning-operator/tuned-x9r2c" Apr 21 03:57:37.906790 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:57:37.905920 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 03:57:37.906790 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.905928 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6de91b0f-af7b-43a5-8a43-cee7c5ba996e-var-lib-kubelet\") pod \"tuned-x9r2c\" (UID: \"6de91b0f-af7b-43a5-8a43-cee7c5ba996e\") " pod="openshift-cluster-node-tuning-operator/tuned-x9r2c" Apr 21 03:57:37.906790 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.905973 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/cd7df76b-5df8-4dcb-8e6c-8e98f5533e3b-cnibin\") pod \"multus-additional-cni-plugins-kmxxq\" (UID: \"cd7df76b-5df8-4dcb-8e6c-8e98f5533e3b\") " pod="openshift-multus/multus-additional-cni-plugins-kmxxq" Apr 21 03:57:37.907246 
ip-10-0-134-136 kubenswrapper[2578]: E0421 03:57:37.905996 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6e98b17f-4794-44aa-8756-58a9bd9cb37a-metrics-certs podName:6e98b17f-4794-44aa-8756-58a9bd9cb37a nodeName:}" failed. No retries permitted until 2026-04-21 03:57:38.405969536 +0000 UTC m=+2.075408589 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6e98b17f-4794-44aa-8756-58a9bd9cb37a-metrics-certs") pod "network-metrics-daemon-2lrq9" (UID: "6e98b17f-4794-44aa-8756-58a9bd9cb37a") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 03:57:37.907246 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.906007 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/cd7df76b-5df8-4dcb-8e6c-8e98f5533e3b-cnibin\") pod \"multus-additional-cni-plugins-kmxxq\" (UID: \"cd7df76b-5df8-4dcb-8e6c-8e98f5533e3b\") " pod="openshift-multus/multus-additional-cni-plugins-kmxxq" Apr 21 03:57:37.907246 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.906024 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7b9f5e8e-4f2f-4d07-87f7-ef56b124e3a6-cni-binary-copy\") pod \"multus-j6rnj\" (UID: \"7b9f5e8e-4f2f-4d07-87f7-ef56b124e3a6\") " pod="openshift-multus/multus-j6rnj" Apr 21 03:57:37.907246 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.906051 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/7b9f5e8e-4f2f-4d07-87f7-ef56b124e3a6-host-run-multus-certs\") pod \"multus-j6rnj\" (UID: \"7b9f5e8e-4f2f-4d07-87f7-ef56b124e3a6\") " pod="openshift-multus/multus-j6rnj" Apr 21 03:57:37.907246 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.906083 2578 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/37b63820-2bf0-4a5b-82c5-6b56ab7689b7-run-openvswitch\") pod \"ovnkube-node-82wml\" (UID: \"37b63820-2bf0-4a5b-82c5-6b56ab7689b7\") " pod="openshift-ovn-kubernetes/ovnkube-node-82wml" Apr 21 03:57:37.907246 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.906093 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/7b9f5e8e-4f2f-4d07-87f7-ef56b124e3a6-host-run-multus-certs\") pod \"multus-j6rnj\" (UID: \"7b9f5e8e-4f2f-4d07-87f7-ef56b124e3a6\") " pod="openshift-multus/multus-j6rnj" Apr 21 03:57:37.907246 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.906126 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/37b63820-2bf0-4a5b-82c5-6b56ab7689b7-log-socket\") pod \"ovnkube-node-82wml\" (UID: \"37b63820-2bf0-4a5b-82c5-6b56ab7689b7\") " pod="openshift-ovn-kubernetes/ovnkube-node-82wml" Apr 21 03:57:37.907246 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.906148 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/37b63820-2bf0-4a5b-82c5-6b56ab7689b7-host-cni-bin\") pod \"ovnkube-node-82wml\" (UID: \"37b63820-2bf0-4a5b-82c5-6b56ab7689b7\") " pod="openshift-ovn-kubernetes/ovnkube-node-82wml" Apr 21 03:57:37.907246 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.906166 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6de91b0f-af7b-43a5-8a43-cee7c5ba996e-sys\") pod \"tuned-x9r2c\" (UID: \"6de91b0f-af7b-43a5-8a43-cee7c5ba996e\") " pod="openshift-cluster-node-tuning-operator/tuned-x9r2c" Apr 21 03:57:37.907246 ip-10-0-134-136 kubenswrapper[2578]: I0421 
03:57:37.906198 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7b9f5e8e-4f2f-4d07-87f7-ef56b124e3a6-os-release\") pod \"multus-j6rnj\" (UID: \"7b9f5e8e-4f2f-4d07-87f7-ef56b124e3a6\") " pod="openshift-multus/multus-j6rnj" Apr 21 03:57:37.907246 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.906223 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/cd7df76b-5df8-4dcb-8e6c-8e98f5533e3b-tuning-conf-dir\") pod \"multus-additional-cni-plugins-kmxxq\" (UID: \"cd7df76b-5df8-4dcb-8e6c-8e98f5533e3b\") " pod="openshift-multus/multus-additional-cni-plugins-kmxxq" Apr 21 03:57:37.907246 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.906239 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7b9f5e8e-4f2f-4d07-87f7-ef56b124e3a6-system-cni-dir\") pod \"multus-j6rnj\" (UID: \"7b9f5e8e-4f2f-4d07-87f7-ef56b124e3a6\") " pod="openshift-multus/multus-j6rnj" Apr 21 03:57:37.907246 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.906261 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7b9f5e8e-4f2f-4d07-87f7-ef56b124e3a6-multus-cni-dir\") pod \"multus-j6rnj\" (UID: \"7b9f5e8e-4f2f-4d07-87f7-ef56b124e3a6\") " pod="openshift-multus/multus-j6rnj" Apr 21 03:57:37.907246 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.906302 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/37b63820-2bf0-4a5b-82c5-6b56ab7689b7-etc-openvswitch\") pod \"ovnkube-node-82wml\" (UID: \"37b63820-2bf0-4a5b-82c5-6b56ab7689b7\") " pod="openshift-ovn-kubernetes/ovnkube-node-82wml" Apr 21 03:57:37.907246 ip-10-0-134-136 kubenswrapper[2578]: I0421 
03:57:37.906333 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7b9f5e8e-4f2f-4d07-87f7-ef56b124e3a6-os-release\") pod \"multus-j6rnj\" (UID: \"7b9f5e8e-4f2f-4d07-87f7-ef56b124e3a6\") " pod="openshift-multus/multus-j6rnj" Apr 21 03:57:37.907246 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.906326 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/37b63820-2bf0-4a5b-82c5-6b56ab7689b7-ovn-node-metrics-cert\") pod \"ovnkube-node-82wml\" (UID: \"37b63820-2bf0-4a5b-82c5-6b56ab7689b7\") " pod="openshift-ovn-kubernetes/ovnkube-node-82wml" Apr 21 03:57:37.907246 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.906377 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/6de91b0f-af7b-43a5-8a43-cee7c5ba996e-etc-systemd\") pod \"tuned-x9r2c\" (UID: \"6de91b0f-af7b-43a5-8a43-cee7c5ba996e\") " pod="openshift-cluster-node-tuning-operator/tuned-x9r2c" Apr 21 03:57:37.907820 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.906394 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/cd7df76b-5df8-4dcb-8e6c-8e98f5533e3b-tuning-conf-dir\") pod \"multus-additional-cni-plugins-kmxxq\" (UID: \"cd7df76b-5df8-4dcb-8e6c-8e98f5533e3b\") " pod="openshift-multus/multus-additional-cni-plugins-kmxxq" Apr 21 03:57:37.907820 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.906396 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7b9f5e8e-4f2f-4d07-87f7-ef56b124e3a6-system-cni-dir\") pod \"multus-j6rnj\" (UID: \"7b9f5e8e-4f2f-4d07-87f7-ef56b124e3a6\") " pod="openshift-multus/multus-j6rnj" Apr 21 03:57:37.907820 ip-10-0-134-136 
kubenswrapper[2578]: I0421 03:57:37.906426 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7b9f5e8e-4f2f-4d07-87f7-ef56b124e3a6-multus-cni-dir\") pod \"multus-j6rnj\" (UID: \"7b9f5e8e-4f2f-4d07-87f7-ef56b124e3a6\") " pod="openshift-multus/multus-j6rnj" Apr 21 03:57:37.907820 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.906452 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7b9f5e8e-4f2f-4d07-87f7-ef56b124e3a6-host-var-lib-cni-bin\") pod \"multus-j6rnj\" (UID: \"7b9f5e8e-4f2f-4d07-87f7-ef56b124e3a6\") " pod="openshift-multus/multus-j6rnj" Apr 21 03:57:37.907820 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.906487 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lm5b4\" (UniqueName: \"kubernetes.io/projected/7b9f5e8e-4f2f-4d07-87f7-ef56b124e3a6-kube-api-access-lm5b4\") pod \"multus-j6rnj\" (UID: \"7b9f5e8e-4f2f-4d07-87f7-ef56b124e3a6\") " pod="openshift-multus/multus-j6rnj" Apr 21 03:57:37.907820 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.906513 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7b9f5e8e-4f2f-4d07-87f7-ef56b124e3a6-host-var-lib-cni-bin\") pod \"multus-j6rnj\" (UID: \"7b9f5e8e-4f2f-4d07-87f7-ef56b124e3a6\") " pod="openshift-multus/multus-j6rnj" Apr 21 03:57:37.907820 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.906519 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmsft\" (UniqueName: \"kubernetes.io/projected/37b63820-2bf0-4a5b-82c5-6b56ab7689b7-kube-api-access-nmsft\") pod \"ovnkube-node-82wml\" (UID: \"37b63820-2bf0-4a5b-82c5-6b56ab7689b7\") " pod="openshift-ovn-kubernetes/ovnkube-node-82wml" Apr 21 03:57:37.907820 ip-10-0-134-136 
kubenswrapper[2578]: I0421 03:57:37.906553 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/30292a27-0318-4f49-b26c-f54654ac07db-serviceca\") pod \"node-ca-jpnv7\" (UID: \"30292a27-0318-4f49-b26c-f54654ac07db\") " pod="openshift-image-registry/node-ca-jpnv7" Apr 21 03:57:37.907820 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.906581 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/a2aaf4f5-f0bd-4a7f-911d-4030333f4bb7-registration-dir\") pod \"aws-ebs-csi-driver-node-648s8\" (UID: \"a2aaf4f5-f0bd-4a7f-911d-4030333f4bb7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-648s8" Apr 21 03:57:37.907820 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.906606 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/a2aaf4f5-f0bd-4a7f-911d-4030333f4bb7-sys-fs\") pod \"aws-ebs-csi-driver-node-648s8\" (UID: \"a2aaf4f5-f0bd-4a7f-911d-4030333f4bb7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-648s8" Apr 21 03:57:37.908569 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.908549 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/adbc5b28-9686-4704-b170-0ada296e15b8-agent-certs\") pod \"konnectivity-agent-7vx69\" (UID: \"adbc5b28-9686-4704-b170-0ada296e15b8\") " pod="kube-system/konnectivity-agent-7vx69" Apr 21 03:57:37.910064 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:57:37.910031 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 03:57:37.910064 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:57:37.910057 2578 projected.go:289] Couldn't 
get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 03:57:37.910227 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:57:37.910071 2578 projected.go:194] Error preparing data for projected volume kube-api-access-8jhrv for pod openshift-network-diagnostics/network-check-target-n6d2m: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 03:57:37.910227 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:57:37.910126 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b5d6aff7-e2e1-4646-9d82-8c931e49196e-kube-api-access-8jhrv podName:b5d6aff7-e2e1-4646-9d82-8c931e49196e nodeName:}" failed. No retries permitted until 2026-04-21 03:57:38.410106836 +0000 UTC m=+2.079545911 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-8jhrv" (UniqueName: "kubernetes.io/projected/b5d6aff7-e2e1-4646-9d82-8c931e49196e-kube-api-access-8jhrv") pod "network-check-target-n6d2m" (UID: "b5d6aff7-e2e1-4646-9d82-8c931e49196e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 03:57:37.911926 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.911898 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bh8tm\" (UniqueName: \"kubernetes.io/projected/cd7df76b-5df8-4dcb-8e6c-8e98f5533e3b-kube-api-access-bh8tm\") pod \"multus-additional-cni-plugins-kmxxq\" (UID: \"cd7df76b-5df8-4dcb-8e6c-8e98f5533e3b\") " pod="openshift-multus/multus-additional-cni-plugins-kmxxq" Apr 21 03:57:37.912859 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.912820 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fxfcv\" (UniqueName: 
\"kubernetes.io/projected/b4a64573-596e-4e34-a0d2-ec31f17a6ba5-kube-api-access-fxfcv\") pod \"iptables-alerter-7j4lb\" (UID: \"b4a64573-596e-4e34-a0d2-ec31f17a6ba5\") " pod="openshift-network-operator/iptables-alerter-7j4lb" Apr 21 03:57:37.913793 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.913776 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5kdj9\" (UniqueName: \"kubernetes.io/projected/6e98b17f-4794-44aa-8756-58a9bd9cb37a-kube-api-access-5kdj9\") pod \"network-metrics-daemon-2lrq9\" (UID: \"6e98b17f-4794-44aa-8756-58a9bd9cb37a\") " pod="openshift-multus/network-metrics-daemon-2lrq9" Apr 21 03:57:37.914307 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.914293 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lm5b4\" (UniqueName: \"kubernetes.io/projected/7b9f5e8e-4f2f-4d07-87f7-ef56b124e3a6-kube-api-access-lm5b4\") pod \"multus-j6rnj\" (UID: \"7b9f5e8e-4f2f-4d07-87f7-ef56b124e3a6\") " pod="openshift-multus/multus-j6rnj" Apr 21 03:57:37.936974 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:37.936945 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7d94c65271f325dffc877a062645ae79.slice/crio-49c3d9626c415aacb71de1b5baa232f250d3f3d2e4d69452baa20da1783e0b58 WatchSource:0}: Error finding container 49c3d9626c415aacb71de1b5baa232f250d3f3d2e4d69452baa20da1783e0b58: Status 404 returned error can't find the container with id 49c3d9626c415aacb71de1b5baa232f250d3f3d2e4d69452baa20da1783e0b58 Apr 21 03:57:37.937319 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:37.937271 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod68f04465e53eef24d16ecd9de5ad5a12.slice/crio-d4faf730d5383873805ec51aaec8fe9a6f4f227675a4086a4a7537896e9eb88a WatchSource:0}: Error finding container 
d4faf730d5383873805ec51aaec8fe9a6f4f227675a4086a4a7537896e9eb88a: Status 404 returned error can't find the container with id d4faf730d5383873805ec51aaec8fe9a6f4f227675a4086a4a7537896e9eb88a Apr 21 03:57:37.942346 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:37.942325 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 21 03:57:38.007497 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:38.007463 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/37b63820-2bf0-4a5b-82c5-6b56ab7689b7-ovnkube-script-lib\") pod \"ovnkube-node-82wml\" (UID: \"37b63820-2bf0-4a5b-82c5-6b56ab7689b7\") " pod="openshift-ovn-kubernetes/ovnkube-node-82wml" Apr 21 03:57:38.007497 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:38.007493 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-smnbx\" (UniqueName: \"kubernetes.io/projected/30292a27-0318-4f49-b26c-f54654ac07db-kube-api-access-smnbx\") pod \"node-ca-jpnv7\" (UID: \"30292a27-0318-4f49-b26c-f54654ac07db\") " pod="openshift-image-registry/node-ca-jpnv7" Apr 21 03:57:38.007724 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:38.007509 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/a2aaf4f5-f0bd-4a7f-911d-4030333f4bb7-socket-dir\") pod \"aws-ebs-csi-driver-node-648s8\" (UID: \"a2aaf4f5-f0bd-4a7f-911d-4030333f4bb7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-648s8" Apr 21 03:57:38.007724 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:38.007527 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a2aaf4f5-f0bd-4a7f-911d-4030333f4bb7-kubelet-dir\") pod \"aws-ebs-csi-driver-node-648s8\" (UID: \"a2aaf4f5-f0bd-4a7f-911d-4030333f4bb7\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-648s8" Apr 21 03:57:38.007724 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:38.007560 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/37b63820-2bf0-4a5b-82c5-6b56ab7689b7-host-cni-netd\") pod \"ovnkube-node-82wml\" (UID: \"37b63820-2bf0-4a5b-82c5-6b56ab7689b7\") " pod="openshift-ovn-kubernetes/ovnkube-node-82wml" Apr 21 03:57:38.007724 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:38.007585 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/30292a27-0318-4f49-b26c-f54654ac07db-host\") pod \"node-ca-jpnv7\" (UID: \"30292a27-0318-4f49-b26c-f54654ac07db\") " pod="openshift-image-registry/node-ca-jpnv7" Apr 21 03:57:38.007724 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:38.007607 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/37b63820-2bf0-4a5b-82c5-6b56ab7689b7-host-kubelet\") pod \"ovnkube-node-82wml\" (UID: \"37b63820-2bf0-4a5b-82c5-6b56ab7689b7\") " pod="openshift-ovn-kubernetes/ovnkube-node-82wml" Apr 21 03:57:38.007724 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:38.007628 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/a2aaf4f5-f0bd-4a7f-911d-4030333f4bb7-etc-selinux\") pod \"aws-ebs-csi-driver-node-648s8\" (UID: \"a2aaf4f5-f0bd-4a7f-911d-4030333f4bb7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-648s8" Apr 21 03:57:38.007724 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:38.007633 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a2aaf4f5-f0bd-4a7f-911d-4030333f4bb7-kubelet-dir\") pod \"aws-ebs-csi-driver-node-648s8\" (UID: 
\"a2aaf4f5-f0bd-4a7f-911d-4030333f4bb7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-648s8" Apr 21 03:57:38.007724 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:38.007648 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/6de91b0f-af7b-43a5-8a43-cee7c5ba996e-etc-sysctl-conf\") pod \"tuned-x9r2c\" (UID: \"6de91b0f-af7b-43a5-8a43-cee7c5ba996e\") " pod="openshift-cluster-node-tuning-operator/tuned-x9r2c" Apr 21 03:57:38.007724 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:38.007671 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/6de91b0f-af7b-43a5-8a43-cee7c5ba996e-run\") pod \"tuned-x9r2c\" (UID: \"6de91b0f-af7b-43a5-8a43-cee7c5ba996e\") " pod="openshift-cluster-node-tuning-operator/tuned-x9r2c" Apr 21 03:57:38.007724 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:38.007635 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/37b63820-2bf0-4a5b-82c5-6b56ab7689b7-host-cni-netd\") pod \"ovnkube-node-82wml\" (UID: \"37b63820-2bf0-4a5b-82c5-6b56ab7689b7\") " pod="openshift-ovn-kubernetes/ovnkube-node-82wml" Apr 21 03:57:38.007724 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:38.007694 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6de91b0f-af7b-43a5-8a43-cee7c5ba996e-lib-modules\") pod \"tuned-x9r2c\" (UID: \"6de91b0f-af7b-43a5-8a43-cee7c5ba996e\") " pod="openshift-cluster-node-tuning-operator/tuned-x9r2c" Apr 21 03:57:38.007724 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:38.007694 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/a2aaf4f5-f0bd-4a7f-911d-4030333f4bb7-socket-dir\") pod \"aws-ebs-csi-driver-node-648s8\" (UID: 
\"a2aaf4f5-f0bd-4a7f-911d-4030333f4bb7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-648s8" Apr 21 03:57:38.007724 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:38.007693 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/30292a27-0318-4f49-b26c-f54654ac07db-host\") pod \"node-ca-jpnv7\" (UID: \"30292a27-0318-4f49-b26c-f54654ac07db\") " pod="openshift-image-registry/node-ca-jpnv7" Apr 21 03:57:38.007724 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:38.007723 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/37b63820-2bf0-4a5b-82c5-6b56ab7689b7-host-kubelet\") pod \"ovnkube-node-82wml\" (UID: \"37b63820-2bf0-4a5b-82c5-6b56ab7689b7\") " pod="openshift-ovn-kubernetes/ovnkube-node-82wml" Apr 21 03:57:38.008232 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:38.007750 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/6de91b0f-af7b-43a5-8a43-cee7c5ba996e-run\") pod \"tuned-x9r2c\" (UID: \"6de91b0f-af7b-43a5-8a43-cee7c5ba996e\") " pod="openshift-cluster-node-tuning-operator/tuned-x9r2c" Apr 21 03:57:38.008232 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:38.007755 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/a2aaf4f5-f0bd-4a7f-911d-4030333f4bb7-etc-selinux\") pod \"aws-ebs-csi-driver-node-648s8\" (UID: \"a2aaf4f5-f0bd-4a7f-911d-4030333f4bb7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-648s8" Apr 21 03:57:38.008232 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:38.007772 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/f4acd3fa-d747-4053-9af8-c38066e122ab-hosts-file\") pod \"node-resolver-jxqfz\" (UID: \"f4acd3fa-d747-4053-9af8-c38066e122ab\") " 
pod="openshift-dns/node-resolver-jxqfz" Apr 21 03:57:38.008232 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:38.007725 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/f4acd3fa-d747-4053-9af8-c38066e122ab-hosts-file\") pod \"node-resolver-jxqfz\" (UID: \"f4acd3fa-d747-4053-9af8-c38066e122ab\") " pod="openshift-dns/node-resolver-jxqfz" Apr 21 03:57:38.008232 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:38.007811 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/37b63820-2bf0-4a5b-82c5-6b56ab7689b7-host-run-ovn-kubernetes\") pod \"ovnkube-node-82wml\" (UID: \"37b63820-2bf0-4a5b-82c5-6b56ab7689b7\") " pod="openshift-ovn-kubernetes/ovnkube-node-82wml" Apr 21 03:57:38.008232 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:38.007822 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/6de91b0f-af7b-43a5-8a43-cee7c5ba996e-etc-sysctl-conf\") pod \"tuned-x9r2c\" (UID: \"6de91b0f-af7b-43a5-8a43-cee7c5ba996e\") " pod="openshift-cluster-node-tuning-operator/tuned-x9r2c" Apr 21 03:57:38.008232 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:38.007839 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6de91b0f-af7b-43a5-8a43-cee7c5ba996e-lib-modules\") pod \"tuned-x9r2c\" (UID: \"6de91b0f-af7b-43a5-8a43-cee7c5ba996e\") " pod="openshift-cluster-node-tuning-operator/tuned-x9r2c" Apr 21 03:57:38.008232 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:38.007839 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/37b63820-2bf0-4a5b-82c5-6b56ab7689b7-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-82wml\" (UID: 
\"37b63820-2bf0-4a5b-82c5-6b56ab7689b7\") " pod="openshift-ovn-kubernetes/ovnkube-node-82wml" Apr 21 03:57:38.008232 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:38.007866 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/6de91b0f-af7b-43a5-8a43-cee7c5ba996e-etc-tuned\") pod \"tuned-x9r2c\" (UID: \"6de91b0f-af7b-43a5-8a43-cee7c5ba996e\") " pod="openshift-cluster-node-tuning-operator/tuned-x9r2c" Apr 21 03:57:38.008232 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:38.007884 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/37b63820-2bf0-4a5b-82c5-6b56ab7689b7-host-run-ovn-kubernetes\") pod \"ovnkube-node-82wml\" (UID: \"37b63820-2bf0-4a5b-82c5-6b56ab7689b7\") " pod="openshift-ovn-kubernetes/ovnkube-node-82wml" Apr 21 03:57:38.008232 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:38.007888 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/37b63820-2bf0-4a5b-82c5-6b56ab7689b7-var-lib-openvswitch\") pod \"ovnkube-node-82wml\" (UID: \"37b63820-2bf0-4a5b-82c5-6b56ab7689b7\") " pod="openshift-ovn-kubernetes/ovnkube-node-82wml" Apr 21 03:57:38.008232 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:38.007899 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/37b63820-2bf0-4a5b-82c5-6b56ab7689b7-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-82wml\" (UID: \"37b63820-2bf0-4a5b-82c5-6b56ab7689b7\") " pod="openshift-ovn-kubernetes/ovnkube-node-82wml" Apr 21 03:57:38.008232 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:38.007921 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: 
\"kubernetes.io/host-path/6de91b0f-af7b-43a5-8a43-cee7c5ba996e-etc-sysconfig\") pod \"tuned-x9r2c\" (UID: \"6de91b0f-af7b-43a5-8a43-cee7c5ba996e\") " pod="openshift-cluster-node-tuning-operator/tuned-x9r2c" Apr 21 03:57:38.008232 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:38.007945 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/6de91b0f-af7b-43a5-8a43-cee7c5ba996e-etc-sysctl-d\") pod \"tuned-x9r2c\" (UID: \"6de91b0f-af7b-43a5-8a43-cee7c5ba996e\") " pod="openshift-cluster-node-tuning-operator/tuned-x9r2c" Apr 21 03:57:38.008232 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:38.007966 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/6de91b0f-af7b-43a5-8a43-cee7c5ba996e-etc-sysconfig\") pod \"tuned-x9r2c\" (UID: \"6de91b0f-af7b-43a5-8a43-cee7c5ba996e\") " pod="openshift-cluster-node-tuning-operator/tuned-x9r2c" Apr 21 03:57:38.008232 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:38.007975 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b7g22\" (UniqueName: \"kubernetes.io/projected/a2aaf4f5-f0bd-4a7f-911d-4030333f4bb7-kube-api-access-b7g22\") pod \"aws-ebs-csi-driver-node-648s8\" (UID: \"a2aaf4f5-f0bd-4a7f-911d-4030333f4bb7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-648s8" Apr 21 03:57:38.008232 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:38.007922 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/37b63820-2bf0-4a5b-82c5-6b56ab7689b7-var-lib-openvswitch\") pod \"ovnkube-node-82wml\" (UID: \"37b63820-2bf0-4a5b-82c5-6b56ab7689b7\") " pod="openshift-ovn-kubernetes/ovnkube-node-82wml" Apr 21 03:57:38.009011 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:38.008007 2578 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"kube-api-access-gmvwp\" (UniqueName: \"kubernetes.io/projected/6de91b0f-af7b-43a5-8a43-cee7c5ba996e-kube-api-access-gmvwp\") pod \"tuned-x9r2c\" (UID: \"6de91b0f-af7b-43a5-8a43-cee7c5ba996e\") " pod="openshift-cluster-node-tuning-operator/tuned-x9r2c" Apr 21 03:57:38.009011 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:38.008033 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f4acd3fa-d747-4053-9af8-c38066e122ab-tmp-dir\") pod \"node-resolver-jxqfz\" (UID: \"f4acd3fa-d747-4053-9af8-c38066e122ab\") " pod="openshift-dns/node-resolver-jxqfz" Apr 21 03:57:38.009011 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:38.008054 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/6de91b0f-af7b-43a5-8a43-cee7c5ba996e-etc-sysctl-d\") pod \"tuned-x9r2c\" (UID: \"6de91b0f-af7b-43a5-8a43-cee7c5ba996e\") " pod="openshift-cluster-node-tuning-operator/tuned-x9r2c" Apr 21 03:57:38.009011 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:38.008057 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/37b63820-2bf0-4a5b-82c5-6b56ab7689b7-systemd-units\") pod \"ovnkube-node-82wml\" (UID: \"37b63820-2bf0-4a5b-82c5-6b56ab7689b7\") " pod="openshift-ovn-kubernetes/ovnkube-node-82wml" Apr 21 03:57:38.009011 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:38.008085 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6de91b0f-af7b-43a5-8a43-cee7c5ba996e-tmp\") pod \"tuned-x9r2c\" (UID: \"6de91b0f-af7b-43a5-8a43-cee7c5ba996e\") " pod="openshift-cluster-node-tuning-operator/tuned-x9r2c" Apr 21 03:57:38.009011 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:38.008089 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" 
(UniqueName: \"kubernetes.io/host-path/37b63820-2bf0-4a5b-82c5-6b56ab7689b7-systemd-units\") pod \"ovnkube-node-82wml\" (UID: \"37b63820-2bf0-4a5b-82c5-6b56ab7689b7\") " pod="openshift-ovn-kubernetes/ovnkube-node-82wml" Apr 21 03:57:38.009011 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:38.008102 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/37b63820-2bf0-4a5b-82c5-6b56ab7689b7-node-log\") pod \"ovnkube-node-82wml\" (UID: \"37b63820-2bf0-4a5b-82c5-6b56ab7689b7\") " pod="openshift-ovn-kubernetes/ovnkube-node-82wml" Apr 21 03:57:38.009011 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:38.008105 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/37b63820-2bf0-4a5b-82c5-6b56ab7689b7-ovnkube-script-lib\") pod \"ovnkube-node-82wml\" (UID: \"37b63820-2bf0-4a5b-82c5-6b56ab7689b7\") " pod="openshift-ovn-kubernetes/ovnkube-node-82wml" Apr 21 03:57:38.009011 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:38.008119 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/6de91b0f-af7b-43a5-8a43-cee7c5ba996e-etc-modprobe-d\") pod \"tuned-x9r2c\" (UID: \"6de91b0f-af7b-43a5-8a43-cee7c5ba996e\") " pod="openshift-cluster-node-tuning-operator/tuned-x9r2c" Apr 21 03:57:38.009011 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:38.008148 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/a2aaf4f5-f0bd-4a7f-911d-4030333f4bb7-device-dir\") pod \"aws-ebs-csi-driver-node-648s8\" (UID: \"a2aaf4f5-f0bd-4a7f-911d-4030333f4bb7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-648s8" Apr 21 03:57:38.009011 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:38.008163 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"node-log\" (UniqueName: \"kubernetes.io/host-path/37b63820-2bf0-4a5b-82c5-6b56ab7689b7-node-log\") pod \"ovnkube-node-82wml\" (UID: \"37b63820-2bf0-4a5b-82c5-6b56ab7689b7\") " pod="openshift-ovn-kubernetes/ovnkube-node-82wml"
Apr 21 03:57:38.009011 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:38.008171 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6de91b0f-af7b-43a5-8a43-cee7c5ba996e-host\") pod \"tuned-x9r2c\" (UID: \"6de91b0f-af7b-43a5-8a43-cee7c5ba996e\") " pod="openshift-cluster-node-tuning-operator/tuned-x9r2c"
Apr 21 03:57:38.009011 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:38.008214 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6de91b0f-af7b-43a5-8a43-cee7c5ba996e-host\") pod \"tuned-x9r2c\" (UID: \"6de91b0f-af7b-43a5-8a43-cee7c5ba996e\") " pod="openshift-cluster-node-tuning-operator/tuned-x9r2c"
Apr 21 03:57:38.009011 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:38.008234 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/37b63820-2bf0-4a5b-82c5-6b56ab7689b7-host-slash\") pod \"ovnkube-node-82wml\" (UID: \"37b63820-2bf0-4a5b-82c5-6b56ab7689b7\") " pod="openshift-ovn-kubernetes/ovnkube-node-82wml"
Apr 21 03:57:38.009011 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:38.008259 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/37b63820-2bf0-4a5b-82c5-6b56ab7689b7-host-run-netns\") pod \"ovnkube-node-82wml\" (UID: \"37b63820-2bf0-4a5b-82c5-6b56ab7689b7\") " pod="openshift-ovn-kubernetes/ovnkube-node-82wml"
Apr 21 03:57:38.009011 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:38.008265 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/a2aaf4f5-f0bd-4a7f-911d-4030333f4bb7-device-dir\") pod \"aws-ebs-csi-driver-node-648s8\" (UID: \"a2aaf4f5-f0bd-4a7f-911d-4030333f4bb7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-648s8"
Apr 21 03:57:38.009011 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:38.008314 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/37b63820-2bf0-4a5b-82c5-6b56ab7689b7-host-slash\") pod \"ovnkube-node-82wml\" (UID: \"37b63820-2bf0-4a5b-82c5-6b56ab7689b7\") " pod="openshift-ovn-kubernetes/ovnkube-node-82wml"
Apr 21 03:57:38.009011 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:38.008323 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/37b63820-2bf0-4a5b-82c5-6b56ab7689b7-run-systemd\") pod \"ovnkube-node-82wml\" (UID: \"37b63820-2bf0-4a5b-82c5-6b56ab7689b7\") " pod="openshift-ovn-kubernetes/ovnkube-node-82wml"
Apr 21 03:57:38.009650 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:38.008327 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/6de91b0f-af7b-43a5-8a43-cee7c5ba996e-etc-modprobe-d\") pod \"tuned-x9r2c\" (UID: \"6de91b0f-af7b-43a5-8a43-cee7c5ba996e\") " pod="openshift-cluster-node-tuning-operator/tuned-x9r2c"
Apr 21 03:57:38.009650 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:38.008351 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/37b63820-2bf0-4a5b-82c5-6b56ab7689b7-run-systemd\") pod \"ovnkube-node-82wml\" (UID: \"37b63820-2bf0-4a5b-82c5-6b56ab7689b7\") " pod="openshift-ovn-kubernetes/ovnkube-node-82wml"
Apr 21 03:57:38.009650 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:38.008356 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/37b63820-2bf0-4a5b-82c5-6b56ab7689b7-run-ovn\") pod \"ovnkube-node-82wml\" (UID: \"37b63820-2bf0-4a5b-82c5-6b56ab7689b7\") " pod="openshift-ovn-kubernetes/ovnkube-node-82wml"
Apr 21 03:57:38.009650 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:38.008380 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6de91b0f-af7b-43a5-8a43-cee7c5ba996e-etc-kubernetes\") pod \"tuned-x9r2c\" (UID: \"6de91b0f-af7b-43a5-8a43-cee7c5ba996e\") " pod="openshift-cluster-node-tuning-operator/tuned-x9r2c"
Apr 21 03:57:38.009650 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:38.008353 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f4acd3fa-d747-4053-9af8-c38066e122ab-tmp-dir\") pod \"node-resolver-jxqfz\" (UID: \"f4acd3fa-d747-4053-9af8-c38066e122ab\") " pod="openshift-dns/node-resolver-jxqfz"
Apr 21 03:57:38.009650 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:38.008393 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/37b63820-2bf0-4a5b-82c5-6b56ab7689b7-run-ovn\") pod \"ovnkube-node-82wml\" (UID: \"37b63820-2bf0-4a5b-82c5-6b56ab7689b7\") " pod="openshift-ovn-kubernetes/ovnkube-node-82wml"
Apr 21 03:57:38.009650 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:38.008401 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6de91b0f-af7b-43a5-8a43-cee7c5ba996e-var-lib-kubelet\") pod \"tuned-x9r2c\" (UID: \"6de91b0f-af7b-43a5-8a43-cee7c5ba996e\") " pod="openshift-cluster-node-tuning-operator/tuned-x9r2c"
Apr 21 03:57:38.009650 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:38.008356 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/37b63820-2bf0-4a5b-82c5-6b56ab7689b7-host-run-netns\") pod \"ovnkube-node-82wml\" (UID: \"37b63820-2bf0-4a5b-82c5-6b56ab7689b7\") " pod="openshift-ovn-kubernetes/ovnkube-node-82wml"
Apr 21 03:57:38.009650 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:38.008436 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/37b63820-2bf0-4a5b-82c5-6b56ab7689b7-run-openvswitch\") pod \"ovnkube-node-82wml\" (UID: \"37b63820-2bf0-4a5b-82c5-6b56ab7689b7\") " pod="openshift-ovn-kubernetes/ovnkube-node-82wml"
Apr 21 03:57:38.009650 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:38.008443 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6de91b0f-af7b-43a5-8a43-cee7c5ba996e-etc-kubernetes\") pod \"tuned-x9r2c\" (UID: \"6de91b0f-af7b-43a5-8a43-cee7c5ba996e\") " pod="openshift-cluster-node-tuning-operator/tuned-x9r2c"
Apr 21 03:57:38.009650 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:38.008461 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/37b63820-2bf0-4a5b-82c5-6b56ab7689b7-log-socket\") pod \"ovnkube-node-82wml\" (UID: \"37b63820-2bf0-4a5b-82c5-6b56ab7689b7\") " pod="openshift-ovn-kubernetes/ovnkube-node-82wml"
Apr 21 03:57:38.009650 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:38.008467 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/37b63820-2bf0-4a5b-82c5-6b56ab7689b7-run-openvswitch\") pod \"ovnkube-node-82wml\" (UID: \"37b63820-2bf0-4a5b-82c5-6b56ab7689b7\") " pod="openshift-ovn-kubernetes/ovnkube-node-82wml"
Apr 21 03:57:38.009650 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:38.008485 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/37b63820-2bf0-4a5b-82c5-6b56ab7689b7-host-cni-bin\") pod \"ovnkube-node-82wml\" (UID: \"37b63820-2bf0-4a5b-82c5-6b56ab7689b7\") " pod="openshift-ovn-kubernetes/ovnkube-node-82wml"
Apr 21 03:57:38.009650 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:38.008491 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6de91b0f-af7b-43a5-8a43-cee7c5ba996e-var-lib-kubelet\") pod \"tuned-x9r2c\" (UID: \"6de91b0f-af7b-43a5-8a43-cee7c5ba996e\") " pod="openshift-cluster-node-tuning-operator/tuned-x9r2c"
Apr 21 03:57:38.009650 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:38.008509 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6de91b0f-af7b-43a5-8a43-cee7c5ba996e-sys\") pod \"tuned-x9r2c\" (UID: \"6de91b0f-af7b-43a5-8a43-cee7c5ba996e\") " pod="openshift-cluster-node-tuning-operator/tuned-x9r2c"
Apr 21 03:57:38.009650 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:38.008522 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/37b63820-2bf0-4a5b-82c5-6b56ab7689b7-log-socket\") pod \"ovnkube-node-82wml\" (UID: \"37b63820-2bf0-4a5b-82c5-6b56ab7689b7\") " pod="openshift-ovn-kubernetes/ovnkube-node-82wml"
Apr 21 03:57:38.009650 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:38.008530 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/37b63820-2bf0-4a5b-82c5-6b56ab7689b7-host-cni-bin\") pod \"ovnkube-node-82wml\" (UID: \"37b63820-2bf0-4a5b-82c5-6b56ab7689b7\") " pod="openshift-ovn-kubernetes/ovnkube-node-82wml"
Apr 21 03:57:38.009650 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:38.008537 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/37b63820-2bf0-4a5b-82c5-6b56ab7689b7-etc-openvswitch\") pod \"ovnkube-node-82wml\" (UID: \"37b63820-2bf0-4a5b-82c5-6b56ab7689b7\") " pod="openshift-ovn-kubernetes/ovnkube-node-82wml"
Apr 21 03:57:38.010140 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:38.008564 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6de91b0f-af7b-43a5-8a43-cee7c5ba996e-sys\") pod \"tuned-x9r2c\" (UID: \"6de91b0f-af7b-43a5-8a43-cee7c5ba996e\") " pod="openshift-cluster-node-tuning-operator/tuned-x9r2c"
Apr 21 03:57:38.010140 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:38.008572 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/37b63820-2bf0-4a5b-82c5-6b56ab7689b7-ovn-node-metrics-cert\") pod \"ovnkube-node-82wml\" (UID: \"37b63820-2bf0-4a5b-82c5-6b56ab7689b7\") " pod="openshift-ovn-kubernetes/ovnkube-node-82wml"
Apr 21 03:57:38.010140 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:38.008584 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/37b63820-2bf0-4a5b-82c5-6b56ab7689b7-etc-openvswitch\") pod \"ovnkube-node-82wml\" (UID: \"37b63820-2bf0-4a5b-82c5-6b56ab7689b7\") " pod="openshift-ovn-kubernetes/ovnkube-node-82wml"
Apr 21 03:57:38.010140 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:38.008597 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/6de91b0f-af7b-43a5-8a43-cee7c5ba996e-etc-systemd\") pod \"tuned-x9r2c\" (UID: \"6de91b0f-af7b-43a5-8a43-cee7c5ba996e\") " pod="openshift-cluster-node-tuning-operator/tuned-x9r2c"
Apr 21 03:57:38.010140 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:38.008623 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nmsft\" (UniqueName: \"kubernetes.io/projected/37b63820-2bf0-4a5b-82c5-6b56ab7689b7-kube-api-access-nmsft\") pod \"ovnkube-node-82wml\" (UID: \"37b63820-2bf0-4a5b-82c5-6b56ab7689b7\") " pod="openshift-ovn-kubernetes/ovnkube-node-82wml"
Apr 21 03:57:38.010140 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:38.008646 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/30292a27-0318-4f49-b26c-f54654ac07db-serviceca\") pod \"node-ca-jpnv7\" (UID: \"30292a27-0318-4f49-b26c-f54654ac07db\") " pod="openshift-image-registry/node-ca-jpnv7"
Apr 21 03:57:38.010140 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:38.008649 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/6de91b0f-af7b-43a5-8a43-cee7c5ba996e-etc-systemd\") pod \"tuned-x9r2c\" (UID: \"6de91b0f-af7b-43a5-8a43-cee7c5ba996e\") " pod="openshift-cluster-node-tuning-operator/tuned-x9r2c"
Apr 21 03:57:38.010140 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:38.008669 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/a2aaf4f5-f0bd-4a7f-911d-4030333f4bb7-registration-dir\") pod \"aws-ebs-csi-driver-node-648s8\" (UID: \"a2aaf4f5-f0bd-4a7f-911d-4030333f4bb7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-648s8"
Apr 21 03:57:38.010140 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:38.008694 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/a2aaf4f5-f0bd-4a7f-911d-4030333f4bb7-sys-fs\") pod \"aws-ebs-csi-driver-node-648s8\" (UID: \"a2aaf4f5-f0bd-4a7f-911d-4030333f4bb7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-648s8"
Apr 21 03:57:38.010140 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:38.008717 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-76hn5\" (UniqueName: \"kubernetes.io/projected/f4acd3fa-d747-4053-9af8-c38066e122ab-kube-api-access-76hn5\") pod \"node-resolver-jxqfz\" (UID: \"f4acd3fa-d747-4053-9af8-c38066e122ab\") " pod="openshift-dns/node-resolver-jxqfz"
Apr 21 03:57:38.010140 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:38.008727 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/a2aaf4f5-f0bd-4a7f-911d-4030333f4bb7-registration-dir\") pod \"aws-ebs-csi-driver-node-648s8\" (UID: \"a2aaf4f5-f0bd-4a7f-911d-4030333f4bb7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-648s8"
Apr 21 03:57:38.010140 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:38.008743 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/37b63820-2bf0-4a5b-82c5-6b56ab7689b7-ovnkube-config\") pod \"ovnkube-node-82wml\" (UID: \"37b63820-2bf0-4a5b-82c5-6b56ab7689b7\") " pod="openshift-ovn-kubernetes/ovnkube-node-82wml"
Apr 21 03:57:38.010140 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:38.008795 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/a2aaf4f5-f0bd-4a7f-911d-4030333f4bb7-sys-fs\") pod \"aws-ebs-csi-driver-node-648s8\" (UID: \"a2aaf4f5-f0bd-4a7f-911d-4030333f4bb7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-648s8"
Apr 21 03:57:38.010140 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:38.008824 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/37b63820-2bf0-4a5b-82c5-6b56ab7689b7-env-overrides\") pod \"ovnkube-node-82wml\" (UID: \"37b63820-2bf0-4a5b-82c5-6b56ab7689b7\") " pod="openshift-ovn-kubernetes/ovnkube-node-82wml"
Apr 21 03:57:38.010140 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:38.009080 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/30292a27-0318-4f49-b26c-f54654ac07db-serviceca\") pod \"node-ca-jpnv7\" (UID: \"30292a27-0318-4f49-b26c-f54654ac07db\") " pod="openshift-image-registry/node-ca-jpnv7"
Apr 21 03:57:38.010140 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:38.009321 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/37b63820-2bf0-4a5b-82c5-6b56ab7689b7-env-overrides\") pod \"ovnkube-node-82wml\" (UID: \"37b63820-2bf0-4a5b-82c5-6b56ab7689b7\") " pod="openshift-ovn-kubernetes/ovnkube-node-82wml"
Apr 21 03:57:38.010140 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:38.009786 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/37b63820-2bf0-4a5b-82c5-6b56ab7689b7-ovnkube-config\") pod \"ovnkube-node-82wml\" (UID: \"37b63820-2bf0-4a5b-82c5-6b56ab7689b7\") " pod="openshift-ovn-kubernetes/ovnkube-node-82wml"
Apr 21 03:57:38.010675 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:38.010559 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/6de91b0f-af7b-43a5-8a43-cee7c5ba996e-etc-tuned\") pod \"tuned-x9r2c\" (UID: \"6de91b0f-af7b-43a5-8a43-cee7c5ba996e\") " pod="openshift-cluster-node-tuning-operator/tuned-x9r2c"
Apr 21 03:57:38.010675 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:38.010583 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6de91b0f-af7b-43a5-8a43-cee7c5ba996e-tmp\") pod \"tuned-x9r2c\" (UID: \"6de91b0f-af7b-43a5-8a43-cee7c5ba996e\") " pod="openshift-cluster-node-tuning-operator/tuned-x9r2c"
Apr 21 03:57:38.010675 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:38.010653 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/37b63820-2bf0-4a5b-82c5-6b56ab7689b7-ovn-node-metrics-cert\") pod \"ovnkube-node-82wml\" (UID: \"37b63820-2bf0-4a5b-82c5-6b56ab7689b7\") " pod="openshift-ovn-kubernetes/ovnkube-node-82wml"
Apr 21 03:57:38.016207 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:38.016186 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b7g22\" (UniqueName: \"kubernetes.io/projected/a2aaf4f5-f0bd-4a7f-911d-4030333f4bb7-kube-api-access-b7g22\") pod \"aws-ebs-csi-driver-node-648s8\" (UID: \"a2aaf4f5-f0bd-4a7f-911d-4030333f4bb7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-648s8"
Apr 21 03:57:38.016384 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:38.016354 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-76hn5\" (UniqueName: \"kubernetes.io/projected/f4acd3fa-d747-4053-9af8-c38066e122ab-kube-api-access-76hn5\") pod \"node-resolver-jxqfz\" (UID: \"f4acd3fa-d747-4053-9af8-c38066e122ab\") " pod="openshift-dns/node-resolver-jxqfz"
Apr 21 03:57:38.016458 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:38.016430 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gmvwp\" (UniqueName: \"kubernetes.io/projected/6de91b0f-af7b-43a5-8a43-cee7c5ba996e-kube-api-access-gmvwp\") pod \"tuned-x9r2c\" (UID: \"6de91b0f-af7b-43a5-8a43-cee7c5ba996e\") " pod="openshift-cluster-node-tuning-operator/tuned-x9r2c"
Apr 21 03:57:38.018161 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:38.018142 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-smnbx\" (UniqueName: \"kubernetes.io/projected/30292a27-0318-4f49-b26c-f54654ac07db-kube-api-access-smnbx\") pod \"node-ca-jpnv7\" (UID: \"30292a27-0318-4f49-b26c-f54654ac07db\") " pod="openshift-image-registry/node-ca-jpnv7"
Apr 21 03:57:38.018225 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:38.018186 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmsft\" (UniqueName: \"kubernetes.io/projected/37b63820-2bf0-4a5b-82c5-6b56ab7689b7-kube-api-access-nmsft\") pod \"ovnkube-node-82wml\" (UID: \"37b63820-2bf0-4a5b-82c5-6b56ab7689b7\") " pod="openshift-ovn-kubernetes/ovnkube-node-82wml"
Apr 21 03:57:38.127444 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:38.127358 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-j6rnj"
Apr 21 03:57:38.133307 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:38.133263 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7b9f5e8e_4f2f_4d07_87f7_ef56b124e3a6.slice/crio-d19df4999377385f5e5a2bb3a0c46e0f17799c14b7c73ecb94c9aba61e7a5272 WatchSource:0}: Error finding container d19df4999377385f5e5a2bb3a0c46e0f17799c14b7c73ecb94c9aba61e7a5272: Status 404 returned error can't find the container with id d19df4999377385f5e5a2bb3a0c46e0f17799c14b7c73ecb94c9aba61e7a5272
Apr 21 03:57:38.143269 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:38.143251 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-7j4lb"
Apr 21 03:57:38.149603 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:38.149574 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb4a64573_596e_4e34_a0d2_ec31f17a6ba5.slice/crio-6fa5e0b2deaee247b45ba0ad23fe71fb4f6b7498d45c917d63b5d1f35b11bafa WatchSource:0}: Error finding container 6fa5e0b2deaee247b45ba0ad23fe71fb4f6b7498d45c917d63b5d1f35b11bafa: Status 404 returned error can't find the container with id 6fa5e0b2deaee247b45ba0ad23fe71fb4f6b7498d45c917d63b5d1f35b11bafa
Apr 21 03:57:38.158007 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:38.157987 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-7vx69"
Apr 21 03:57:38.173376 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:38.173353 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-kmxxq"
Apr 21 03:57:38.180018 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:38.179993 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcd7df76b_5df8_4dcb_8e6c_8e98f5533e3b.slice/crio-d524d0e8c69c5508730ece263a9e6876c559b644b684033ac5828d9336686619 WatchSource:0}: Error finding container d524d0e8c69c5508730ece263a9e6876c559b644b684033ac5828d9336686619: Status 404 returned error can't find the container with id d524d0e8c69c5508730ece263a9e6876c559b644b684033ac5828d9336686619
Apr 21 03:57:38.190894 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:38.190872 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-82wml"
Apr 21 03:57:38.197018 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:38.196995 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37b63820_2bf0_4a5b_82c5_6b56ab7689b7.slice/crio-085ad2311f3e4bb54c5b6bb791b9f129feff6bc041713c8fe39d2e841748bcf4 WatchSource:0}: Error finding container 085ad2311f3e4bb54c5b6bb791b9f129feff6bc041713c8fe39d2e841748bcf4: Status 404 returned error can't find the container with id 085ad2311f3e4bb54c5b6bb791b9f129feff6bc041713c8fe39d2e841748bcf4
Apr 21 03:57:38.207359 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:38.207342 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-648s8"
Apr 21 03:57:38.213569 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:38.213546 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda2aaf4f5_f0bd_4a7f_911d_4030333f4bb7.slice/crio-2eb8b4d1eb7137cb9284ad4b48a7bec039ff876cd1cb9d2923bf6f7b5406aee4 WatchSource:0}: Error finding container 2eb8b4d1eb7137cb9284ad4b48a7bec039ff876cd1cb9d2923bf6f7b5406aee4: Status 404 returned error can't find the container with id 2eb8b4d1eb7137cb9284ad4b48a7bec039ff876cd1cb9d2923bf6f7b5406aee4
Apr 21 03:57:38.214966 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:38.214948 2578 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 21 03:57:38.233376 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:38.233353 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-x9r2c"
Apr 21 03:57:38.240327 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:38.240098 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-jxqfz"
Apr 21 03:57:38.241503 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:38.241479 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6de91b0f_af7b_43a5_8a43_cee7c5ba996e.slice/crio-8995406e13c69a7b897ea6ae7c88aaf030dc87bfdf99b64e427af0e9d0208bb6 WatchSource:0}: Error finding container 8995406e13c69a7b897ea6ae7c88aaf030dc87bfdf99b64e427af0e9d0208bb6: Status 404 returned error can't find the container with id 8995406e13c69a7b897ea6ae7c88aaf030dc87bfdf99b64e427af0e9d0208bb6
Apr 21 03:57:38.245078 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:38.245008 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-jpnv7"
Apr 21 03:57:38.247336 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:38.247316 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4acd3fa_d747_4053_9af8_c38066e122ab.slice/crio-e0b734a89e9f2cddaf65a1c56fcbced70f7b306c7d002fdcd605da8e257e26f1 WatchSource:0}: Error finding container e0b734a89e9f2cddaf65a1c56fcbced70f7b306c7d002fdcd605da8e257e26f1: Status 404 returned error can't find the container with id e0b734a89e9f2cddaf65a1c56fcbced70f7b306c7d002fdcd605da8e257e26f1
Apr 21 03:57:38.251296 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:57:38.251264 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod30292a27_0318_4f49_b26c_f54654ac07db.slice/crio-596801bebe6aebb916d4052db708db08d9453bac9147f542f26f913cd2aae12b WatchSource:0}: Error finding container 596801bebe6aebb916d4052db708db08d9453bac9147f542f26f913cd2aae12b: Status 404 returned error can't find the container with id 596801bebe6aebb916d4052db708db08d9453bac9147f542f26f913cd2aae12b
Apr 21 03:57:38.411783 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:38.411703 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8jhrv\" (UniqueName: \"kubernetes.io/projected/b5d6aff7-e2e1-4646-9d82-8c931e49196e-kube-api-access-8jhrv\") pod \"network-check-target-n6d2m\" (UID: \"b5d6aff7-e2e1-4646-9d82-8c931e49196e\") " pod="openshift-network-diagnostics/network-check-target-n6d2m"
Apr 21 03:57:38.411941 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:38.411803 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6e98b17f-4794-44aa-8756-58a9bd9cb37a-metrics-certs\") pod \"network-metrics-daemon-2lrq9\" (UID: \"6e98b17f-4794-44aa-8756-58a9bd9cb37a\") " pod="openshift-multus/network-metrics-daemon-2lrq9"
Apr 21 03:57:38.411941 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:57:38.411927 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 03:57:38.412055 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:57:38.411982 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6e98b17f-4794-44aa-8756-58a9bd9cb37a-metrics-certs podName:6e98b17f-4794-44aa-8756-58a9bd9cb37a nodeName:}" failed. No retries permitted until 2026-04-21 03:57:39.411964038 +0000 UTC m=+3.081403104 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6e98b17f-4794-44aa-8756-58a9bd9cb37a-metrics-certs") pod "network-metrics-daemon-2lrq9" (UID: "6e98b17f-4794-44aa-8756-58a9bd9cb37a") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 03:57:38.412236 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:57:38.412215 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 21 03:57:38.412320 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:57:38.412245 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 21 03:57:38.412320 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:57:38.412259 2578 projected.go:194] Error preparing data for projected volume kube-api-access-8jhrv for pod openshift-network-diagnostics/network-check-target-n6d2m: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 03:57:38.412426 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:57:38.412334 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b5d6aff7-e2e1-4646-9d82-8c931e49196e-kube-api-access-8jhrv podName:b5d6aff7-e2e1-4646-9d82-8c931e49196e nodeName:}" failed. No retries permitted until 2026-04-21 03:57:39.41231799 +0000 UTC m=+3.081757058 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-8jhrv" (UniqueName: "kubernetes.io/projected/b5d6aff7-e2e1-4646-9d82-8c931e49196e-kube-api-access-8jhrv") pod "network-check-target-n6d2m" (UID: "b5d6aff7-e2e1-4646-9d82-8c931e49196e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 03:57:38.746303 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:38.746041 2578 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 21 03:57:38.843139 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:38.843058 2578 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-20 03:52:37 +0000 UTC" deadline="2027-09-29 23:04:18.495568787 +0000 UTC"
Apr 21 03:57:38.843139 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:38.843087 2578 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12643h6m39.652485816s"
Apr 21 03:57:38.944321 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:38.944237 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-136.ec2.internal" event={"ID":"7d94c65271f325dffc877a062645ae79","Type":"ContainerStarted","Data":"49c3d9626c415aacb71de1b5baa232f250d3f3d2e4d69452baa20da1783e0b58"}
Apr 21 03:57:38.947945 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:38.947908 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-136.ec2.internal" event={"ID":"68f04465e53eef24d16ecd9de5ad5a12","Type":"ContainerStarted","Data":"d4faf730d5383873805ec51aaec8fe9a6f4f227675a4086a4a7537896e9eb88a"}
Apr 21 03:57:38.954112 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:38.954048 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-x9r2c" event={"ID":"6de91b0f-af7b-43a5-8a43-cee7c5ba996e","Type":"ContainerStarted","Data":"8995406e13c69a7b897ea6ae7c88aaf030dc87bfdf99b64e427af0e9d0208bb6"}
Apr 21 03:57:38.959881 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:38.959851 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-648s8" event={"ID":"a2aaf4f5-f0bd-4a7f-911d-4030333f4bb7","Type":"ContainerStarted","Data":"2eb8b4d1eb7137cb9284ad4b48a7bec039ff876cd1cb9d2923bf6f7b5406aee4"}
Apr 21 03:57:38.963965 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:38.963920 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kmxxq" event={"ID":"cd7df76b-5df8-4dcb-8e6c-8e98f5533e3b","Type":"ContainerStarted","Data":"d524d0e8c69c5508730ece263a9e6876c559b644b684033ac5828d9336686619"}
Apr 21 03:57:38.979587 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:38.979481 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-7vx69" event={"ID":"adbc5b28-9686-4704-b170-0ada296e15b8","Type":"ContainerStarted","Data":"6ab9b4050c2d732ab7cda88085784c02efd57a59920f7192e76ab1cb5cc8a157"}
Apr 21 03:57:38.981925 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:38.981843 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-7j4lb" event={"ID":"b4a64573-596e-4e34-a0d2-ec31f17a6ba5","Type":"ContainerStarted","Data":"6fa5e0b2deaee247b45ba0ad23fe71fb4f6b7498d45c917d63b5d1f35b11bafa"}
Apr 21 03:57:38.997096 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:38.997011 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-j6rnj" event={"ID":"7b9f5e8e-4f2f-4d07-87f7-ef56b124e3a6","Type":"ContainerStarted","Data":"d19df4999377385f5e5a2bb3a0c46e0f17799c14b7c73ecb94c9aba61e7a5272"}
Apr 21 03:57:39.000077 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:39.000046 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-jpnv7" event={"ID":"30292a27-0318-4f49-b26c-f54654ac07db","Type":"ContainerStarted","Data":"596801bebe6aebb916d4052db708db08d9453bac9147f542f26f913cd2aae12b"}
Apr 21 03:57:39.003529 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:39.003501 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-jxqfz" event={"ID":"f4acd3fa-d747-4053-9af8-c38066e122ab","Type":"ContainerStarted","Data":"e0b734a89e9f2cddaf65a1c56fcbced70f7b306c7d002fdcd605da8e257e26f1"}
Apr 21 03:57:39.010087 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:39.010042 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-82wml" event={"ID":"37b63820-2bf0-4a5b-82c5-6b56ab7689b7","Type":"ContainerStarted","Data":"085ad2311f3e4bb54c5b6bb791b9f129feff6bc041713c8fe39d2e841748bcf4"}
Apr 21 03:57:39.307796 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:39.307569 2578 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 21 03:57:39.420045 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:39.419224 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6e98b17f-4794-44aa-8756-58a9bd9cb37a-metrics-certs\") pod \"network-metrics-daemon-2lrq9\" (UID: \"6e98b17f-4794-44aa-8756-58a9bd9cb37a\") " pod="openshift-multus/network-metrics-daemon-2lrq9"
Apr 21 03:57:39.420045 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:39.419312 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8jhrv\" (UniqueName: \"kubernetes.io/projected/b5d6aff7-e2e1-4646-9d82-8c931e49196e-kube-api-access-8jhrv\") pod \"network-check-target-n6d2m\" (UID: \"b5d6aff7-e2e1-4646-9d82-8c931e49196e\") " pod="openshift-network-diagnostics/network-check-target-n6d2m"
Apr 21 03:57:39.420045 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:57:39.419468 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 21 03:57:39.420045 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:57:39.419488 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 21 03:57:39.420045 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:57:39.419501 2578 projected.go:194] Error preparing data for projected volume kube-api-access-8jhrv for pod openshift-network-diagnostics/network-check-target-n6d2m: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 03:57:39.420045 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:57:39.419559 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b5d6aff7-e2e1-4646-9d82-8c931e49196e-kube-api-access-8jhrv podName:b5d6aff7-e2e1-4646-9d82-8c931e49196e nodeName:}" failed. No retries permitted until 2026-04-21 03:57:41.419541329 +0000 UTC m=+5.088980385 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-8jhrv" (UniqueName: "kubernetes.io/projected/b5d6aff7-e2e1-4646-9d82-8c931e49196e-kube-api-access-8jhrv") pod "network-check-target-n6d2m" (UID: "b5d6aff7-e2e1-4646-9d82-8c931e49196e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 03:57:39.420045 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:57:39.419968 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 03:57:39.420045 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:57:39.420016 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6e98b17f-4794-44aa-8756-58a9bd9cb37a-metrics-certs podName:6e98b17f-4794-44aa-8756-58a9bd9cb37a nodeName:}" failed. No retries permitted until 2026-04-21 03:57:41.420001153 +0000 UTC m=+5.089440210 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6e98b17f-4794-44aa-8756-58a9bd9cb37a-metrics-certs") pod "network-metrics-daemon-2lrq9" (UID: "6e98b17f-4794-44aa-8756-58a9bd9cb37a") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 03:57:39.843891 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:39.843847 2578 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-20 03:52:37 +0000 UTC" deadline="2027-11-04 14:02:23.834224282 +0000 UTC"
Apr 21 03:57:39.843891 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:39.843885 2578 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13498h4m43.990342853s"
Apr 21 03:57:39.924786 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:39.923999 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2lrq9"
Apr 21 03:57:39.924786 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:57:39.924144 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2lrq9" podUID="6e98b17f-4794-44aa-8756-58a9bd9cb37a"
Apr 21 03:57:39.924786 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:39.924253 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-n6d2m"
Apr 21 03:57:39.924786 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:57:39.924338 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-n6d2m" podUID="b5d6aff7-e2e1-4646-9d82-8c931e49196e" Apr 21 03:57:41.435158 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:41.435123 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8jhrv\" (UniqueName: \"kubernetes.io/projected/b5d6aff7-e2e1-4646-9d82-8c931e49196e-kube-api-access-8jhrv\") pod \"network-check-target-n6d2m\" (UID: \"b5d6aff7-e2e1-4646-9d82-8c931e49196e\") " pod="openshift-network-diagnostics/network-check-target-n6d2m" Apr 21 03:57:41.435647 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:41.435198 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6e98b17f-4794-44aa-8756-58a9bd9cb37a-metrics-certs\") pod \"network-metrics-daemon-2lrq9\" (UID: \"6e98b17f-4794-44aa-8756-58a9bd9cb37a\") " pod="openshift-multus/network-metrics-daemon-2lrq9" Apr 21 03:57:41.435647 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:57:41.435353 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 03:57:41.435647 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:57:41.435413 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6e98b17f-4794-44aa-8756-58a9bd9cb37a-metrics-certs podName:6e98b17f-4794-44aa-8756-58a9bd9cb37a nodeName:}" failed. No retries permitted until 2026-04-21 03:57:45.435393392 +0000 UTC m=+9.104832449 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6e98b17f-4794-44aa-8756-58a9bd9cb37a-metrics-certs") pod "network-metrics-daemon-2lrq9" (UID: "6e98b17f-4794-44aa-8756-58a9bd9cb37a") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 03:57:41.435957 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:57:41.435873 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 03:57:41.435957 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:57:41.435895 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 03:57:41.435957 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:57:41.435907 2578 projected.go:194] Error preparing data for projected volume kube-api-access-8jhrv for pod openshift-network-diagnostics/network-check-target-n6d2m: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 03:57:41.435957 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:57:41.435955 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b5d6aff7-e2e1-4646-9d82-8c931e49196e-kube-api-access-8jhrv podName:b5d6aff7-e2e1-4646-9d82-8c931e49196e nodeName:}" failed. No retries permitted until 2026-04-21 03:57:45.435939736 +0000 UTC m=+9.105378808 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-8jhrv" (UniqueName: "kubernetes.io/projected/b5d6aff7-e2e1-4646-9d82-8c931e49196e-kube-api-access-8jhrv") pod "network-check-target-n6d2m" (UID: "b5d6aff7-e2e1-4646-9d82-8c931e49196e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 03:57:41.924242 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:41.924213 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2lrq9" Apr 21 03:57:41.924412 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:57:41.924370 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2lrq9" podUID="6e98b17f-4794-44aa-8756-58a9bd9cb37a" Apr 21 03:57:41.924507 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:41.924489 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-n6d2m" Apr 21 03:57:41.924623 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:57:41.924573 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-n6d2m" podUID="b5d6aff7-e2e1-4646-9d82-8c931e49196e" Apr 21 03:57:43.924358 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:43.924322 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-2lrq9" Apr 21 03:57:43.924898 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:43.924322 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-n6d2m" Apr 21 03:57:43.924898 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:57:43.924464 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2lrq9" podUID="6e98b17f-4794-44aa-8756-58a9bd9cb37a" Apr 21 03:57:43.924898 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:57:43.924570 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-n6d2m" podUID="b5d6aff7-e2e1-4646-9d82-8c931e49196e" Apr 21 03:57:45.467561 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:45.467519 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6e98b17f-4794-44aa-8756-58a9bd9cb37a-metrics-certs\") pod \"network-metrics-daemon-2lrq9\" (UID: \"6e98b17f-4794-44aa-8756-58a9bd9cb37a\") " pod="openshift-multus/network-metrics-daemon-2lrq9" Apr 21 03:57:45.468003 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:45.467603 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8jhrv\" (UniqueName: \"kubernetes.io/projected/b5d6aff7-e2e1-4646-9d82-8c931e49196e-kube-api-access-8jhrv\") pod \"network-check-target-n6d2m\" (UID: \"b5d6aff7-e2e1-4646-9d82-8c931e49196e\") " pod="openshift-network-diagnostics/network-check-target-n6d2m" Apr 21 03:57:45.468003 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:57:45.467765 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 03:57:45.468003 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:57:45.467785 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 03:57:45.468003 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:57:45.467797 2578 projected.go:194] Error preparing data for projected volume kube-api-access-8jhrv for pod openshift-network-diagnostics/network-check-target-n6d2m: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 03:57:45.468003 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:57:45.467824 2578 secret.go:189] Couldn't get secret 
openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 03:57:45.468003 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:57:45.467855 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b5d6aff7-e2e1-4646-9d82-8c931e49196e-kube-api-access-8jhrv podName:b5d6aff7-e2e1-4646-9d82-8c931e49196e nodeName:}" failed. No retries permitted until 2026-04-21 03:57:53.467836076 +0000 UTC m=+17.137275146 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-8jhrv" (UniqueName: "kubernetes.io/projected/b5d6aff7-e2e1-4646-9d82-8c931e49196e-kube-api-access-8jhrv") pod "network-check-target-n6d2m" (UID: "b5d6aff7-e2e1-4646-9d82-8c931e49196e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 03:57:45.468003 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:57:45.467873 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6e98b17f-4794-44aa-8756-58a9bd9cb37a-metrics-certs podName:6e98b17f-4794-44aa-8756-58a9bd9cb37a nodeName:}" failed. No retries permitted until 2026-04-21 03:57:53.467864355 +0000 UTC m=+17.137303408 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6e98b17f-4794-44aa-8756-58a9bd9cb37a-metrics-certs") pod "network-metrics-daemon-2lrq9" (UID: "6e98b17f-4794-44aa-8756-58a9bd9cb37a") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 03:57:45.924310 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:45.924263 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-n6d2m" Apr 21 03:57:45.924485 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:45.924267 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-2lrq9" Apr 21 03:57:45.924485 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:57:45.924412 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-n6d2m" podUID="b5d6aff7-e2e1-4646-9d82-8c931e49196e" Apr 21 03:57:45.924607 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:57:45.924526 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2lrq9" podUID="6e98b17f-4794-44aa-8756-58a9bd9cb37a" Apr 21 03:57:47.924150 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:47.924110 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-n6d2m" Apr 21 03:57:47.924531 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:47.924110 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2lrq9" Apr 21 03:57:47.924531 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:57:47.924218 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-n6d2m" podUID="b5d6aff7-e2e1-4646-9d82-8c931e49196e" Apr 21 03:57:47.924531 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:57:47.924343 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2lrq9" podUID="6e98b17f-4794-44aa-8756-58a9bd9cb37a" Apr 21 03:57:49.924960 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:49.924923 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-n6d2m" Apr 21 03:57:49.925516 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:49.924935 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2lrq9" Apr 21 03:57:49.925516 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:57:49.925044 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-n6d2m" podUID="b5d6aff7-e2e1-4646-9d82-8c931e49196e" Apr 21 03:57:49.925516 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:57:49.925152 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2lrq9" podUID="6e98b17f-4794-44aa-8756-58a9bd9cb37a" Apr 21 03:57:51.924696 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:51.924666 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-n6d2m" Apr 21 03:57:51.925080 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:51.924665 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2lrq9" Apr 21 03:57:51.925080 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:57:51.924782 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-n6d2m" podUID="b5d6aff7-e2e1-4646-9d82-8c931e49196e" Apr 21 03:57:51.925080 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:57:51.924859 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2lrq9" podUID="6e98b17f-4794-44aa-8756-58a9bd9cb37a" Apr 21 03:57:53.524665 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:53.524631 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6e98b17f-4794-44aa-8756-58a9bd9cb37a-metrics-certs\") pod \"network-metrics-daemon-2lrq9\" (UID: \"6e98b17f-4794-44aa-8756-58a9bd9cb37a\") " pod="openshift-multus/network-metrics-daemon-2lrq9" Apr 21 03:57:53.525105 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:53.524686 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8jhrv\" (UniqueName: \"kubernetes.io/projected/b5d6aff7-e2e1-4646-9d82-8c931e49196e-kube-api-access-8jhrv\") pod \"network-check-target-n6d2m\" (UID: \"b5d6aff7-e2e1-4646-9d82-8c931e49196e\") " pod="openshift-network-diagnostics/network-check-target-n6d2m" Apr 21 03:57:53.525105 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:57:53.524801 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 03:57:53.525105 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:57:53.524814 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 03:57:53.525105 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:57:53.524828 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 03:57:53.525105 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:57:53.524837 2578 projected.go:194] Error preparing data for projected volume kube-api-access-8jhrv for pod openshift-network-diagnostics/network-check-target-n6d2m: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, 
object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 03:57:53.525105 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:57:53.524866 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6e98b17f-4794-44aa-8756-58a9bd9cb37a-metrics-certs podName:6e98b17f-4794-44aa-8756-58a9bd9cb37a nodeName:}" failed. No retries permitted until 2026-04-21 03:58:09.524850852 +0000 UTC m=+33.194289909 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6e98b17f-4794-44aa-8756-58a9bd9cb37a-metrics-certs") pod "network-metrics-daemon-2lrq9" (UID: "6e98b17f-4794-44aa-8756-58a9bd9cb37a") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 03:57:53.525105 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:57:53.524880 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b5d6aff7-e2e1-4646-9d82-8c931e49196e-kube-api-access-8jhrv podName:b5d6aff7-e2e1-4646-9d82-8c931e49196e nodeName:}" failed. No retries permitted until 2026-04-21 03:58:09.524874069 +0000 UTC m=+33.194313127 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-8jhrv" (UniqueName: "kubernetes.io/projected/b5d6aff7-e2e1-4646-9d82-8c931e49196e-kube-api-access-8jhrv") pod "network-check-target-n6d2m" (UID: "b5d6aff7-e2e1-4646-9d82-8c931e49196e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 03:57:53.924567 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:53.924496 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-n6d2m" Apr 21 03:57:53.924749 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:53.924500 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-2lrq9" Apr 21 03:57:53.924749 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:57:53.924597 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-n6d2m" podUID="b5d6aff7-e2e1-4646-9d82-8c931e49196e" Apr 21 03:57:53.924749 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:57:53.924680 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2lrq9" podUID="6e98b17f-4794-44aa-8756-58a9bd9cb37a" Apr 21 03:57:55.924735 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:55.924704 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-n6d2m" Apr 21 03:57:55.925118 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:55.924704 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2lrq9" Apr 21 03:57:55.925118 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:57:55.924809 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-n6d2m" podUID="b5d6aff7-e2e1-4646-9d82-8c931e49196e" Apr 21 03:57:55.925118 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:57:55.924897 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2lrq9" podUID="6e98b17f-4794-44aa-8756-58a9bd9cb37a" Apr 21 03:57:57.050701 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:57.050218 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-x9r2c" event={"ID":"6de91b0f-af7b-43a5-8a43-cee7c5ba996e","Type":"ContainerStarted","Data":"ab5b809ba1cd1ff07c173033ba7e33977b3116786fc6e713bc4d04cf84906967"} Apr 21 03:57:57.052123 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:57.052061 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-j6rnj" event={"ID":"7b9f5e8e-4f2f-4d07-87f7-ef56b124e3a6","Type":"ContainerStarted","Data":"9cb74704d78e7fb6ca39cd2ba38f7f22c2c1c507d00f8aab8fc8fb9333f12e7c"} Apr 21 03:57:57.059397 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:57.059368 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-82wml_37b63820-2bf0-4a5b-82c5-6b56ab7689b7/ovn-acl-logging/0.log" Apr 21 03:57:57.059913 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:57.059880 2578 generic.go:358] "Generic (PLEG): container finished" podID="37b63820-2bf0-4a5b-82c5-6b56ab7689b7" containerID="df8604e4197beecc89f91febd3e019e34ea62fb831e195b18dbc062e817cbda9" exitCode=1 Apr 21 03:57:57.060048 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:57.059943 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-82wml" 
event={"ID":"37b63820-2bf0-4a5b-82c5-6b56ab7689b7","Type":"ContainerStarted","Data":"697e504f704603de2a6461f1a4c2a9ff1df0bea275cb326f77c6484135940abc"} Apr 21 03:57:57.060048 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:57.059974 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-82wml" event={"ID":"37b63820-2bf0-4a5b-82c5-6b56ab7689b7","Type":"ContainerStarted","Data":"5783b115dac13e3479852a8844bc4c0f2dbc629eae7f612879ef691a040025a1"} Apr 21 03:57:57.060048 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:57.059991 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-82wml" event={"ID":"37b63820-2bf0-4a5b-82c5-6b56ab7689b7","Type":"ContainerStarted","Data":"ee9ae954d258cd7a8bb9c8a8faa8d977b607bc2fa4b1687647a1909c733b9223"} Apr 21 03:57:57.060048 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:57.060005 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-82wml" event={"ID":"37b63820-2bf0-4a5b-82c5-6b56ab7689b7","Type":"ContainerStarted","Data":"ca740b4875c8bd845b9463e22bd25ac30142f1230720b9865d0961847336e657"} Apr 21 03:57:57.060048 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:57.060018 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-82wml" event={"ID":"37b63820-2bf0-4a5b-82c5-6b56ab7689b7","Type":"ContainerDied","Data":"df8604e4197beecc89f91febd3e019e34ea62fb831e195b18dbc062e817cbda9"} Apr 21 03:57:57.060323 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:57.060057 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-82wml" event={"ID":"37b63820-2bf0-4a5b-82c5-6b56ab7689b7","Type":"ContainerStarted","Data":"658f6418644cf0176cf7990e1c0a4cec49d6171cef473cfa7867e1a41091e9b0"} Apr 21 03:57:57.062236 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:57.062215 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kube-system/kube-apiserver-proxy-ip-10-0-134-136.ec2.internal" event={"ID":"68f04465e53eef24d16ecd9de5ad5a12","Type":"ContainerStarted","Data":"f2d7c18df3c5be887190b314b2188e3919effdff73068eb2613ca93eb0c2a929"}
Apr 21 03:57:57.068860 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:57.068805 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-x9r2c" podStartSLOduration=2.069354846 podStartE2EDuration="20.068792825s" podCreationTimestamp="2026-04-21 03:57:37 +0000 UTC" firstStartedPulling="2026-04-21 03:57:38.243704457 +0000 UTC m=+1.913143510" lastFinishedPulling="2026-04-21 03:57:56.243142419 +0000 UTC m=+19.912581489" observedRunningTime="2026-04-21 03:57:57.068588258 +0000 UTC m=+20.738027332" watchObservedRunningTime="2026-04-21 03:57:57.068792825 +0000 UTC m=+20.738231900"
Apr 21 03:57:57.083183 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:57.083135 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-136.ec2.internal" podStartSLOduration=20.083119217 podStartE2EDuration="20.083119217s" podCreationTimestamp="2026-04-21 03:57:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 03:57:57.082672063 +0000 UTC m=+20.752111139" watchObservedRunningTime="2026-04-21 03:57:57.083119217 +0000 UTC m=+20.752558291"
Apr 21 03:57:57.099342 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:57.099266 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-j6rnj" podStartSLOduration=2.7444898479999997 podStartE2EDuration="21.099243418s" podCreationTimestamp="2026-04-21 03:57:36 +0000 UTC" firstStartedPulling="2026-04-21 03:57:38.134881859 +0000 UTC m=+1.804320913" lastFinishedPulling="2026-04-21 03:57:56.48963543 +0000 UTC m=+20.159074483" observedRunningTime="2026-04-21 03:57:57.098850818 +0000 UTC m=+20.768289915" watchObservedRunningTime="2026-04-21 03:57:57.099243418 +0000 UTC m=+20.768682494"
Apr 21 03:57:57.756651 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:57.756454 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-t75ld"]
Apr 21 03:57:57.759403 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:57.759376 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-t75ld"
Apr 21 03:57:57.759525 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:57:57.759459 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-t75ld" podUID="ee91e745-0c10-4d73-b8fd-a96e27ea14b8"
Apr 21 03:57:57.854734 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:57.854691 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/ee91e745-0c10-4d73-b8fd-a96e27ea14b8-kubelet-config\") pod \"global-pull-secret-syncer-t75ld\" (UID: \"ee91e745-0c10-4d73-b8fd-a96e27ea14b8\") " pod="kube-system/global-pull-secret-syncer-t75ld"
Apr 21 03:57:57.854734 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:57.854741 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/ee91e745-0c10-4d73-b8fd-a96e27ea14b8-original-pull-secret\") pod \"global-pull-secret-syncer-t75ld\" (UID: \"ee91e745-0c10-4d73-b8fd-a96e27ea14b8\") " pod="kube-system/global-pull-secret-syncer-t75ld"
Apr 21 03:57:57.854926 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:57.854778 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/ee91e745-0c10-4d73-b8fd-a96e27ea14b8-dbus\") pod \"global-pull-secret-syncer-t75ld\" (UID: \"ee91e745-0c10-4d73-b8fd-a96e27ea14b8\") " pod="kube-system/global-pull-secret-syncer-t75ld"
Apr 21 03:57:57.924398 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:57.924369 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-n6d2m"
Apr 21 03:57:57.924550 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:57.924369 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2lrq9"
Apr 21 03:57:57.924550 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:57:57.924472 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-n6d2m" podUID="b5d6aff7-e2e1-4646-9d82-8c931e49196e"
Apr 21 03:57:57.924628 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:57:57.924579 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2lrq9" podUID="6e98b17f-4794-44aa-8756-58a9bd9cb37a"
Apr 21 03:57:57.955842 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:57.955813 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/ee91e745-0c10-4d73-b8fd-a96e27ea14b8-kubelet-config\") pod \"global-pull-secret-syncer-t75ld\" (UID: \"ee91e745-0c10-4d73-b8fd-a96e27ea14b8\") " pod="kube-system/global-pull-secret-syncer-t75ld"
Apr 21 03:57:57.955986 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:57.955856 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/ee91e745-0c10-4d73-b8fd-a96e27ea14b8-original-pull-secret\") pod \"global-pull-secret-syncer-t75ld\" (UID: \"ee91e745-0c10-4d73-b8fd-a96e27ea14b8\") " pod="kube-system/global-pull-secret-syncer-t75ld"
Apr 21 03:57:57.955986 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:57.955938 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/ee91e745-0c10-4d73-b8fd-a96e27ea14b8-kubelet-config\") pod \"global-pull-secret-syncer-t75ld\" (UID: \"ee91e745-0c10-4d73-b8fd-a96e27ea14b8\") " pod="kube-system/global-pull-secret-syncer-t75ld"
Apr 21 03:57:57.955986 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:57:57.955949 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 21 03:57:57.956075 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:57:57.956021 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ee91e745-0c10-4d73-b8fd-a96e27ea14b8-original-pull-secret podName:ee91e745-0c10-4d73-b8fd-a96e27ea14b8 nodeName:}" failed. No retries permitted until 2026-04-21 03:57:58.456000533 +0000 UTC m=+22.125439586 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/ee91e745-0c10-4d73-b8fd-a96e27ea14b8-original-pull-secret") pod "global-pull-secret-syncer-t75ld" (UID: "ee91e745-0c10-4d73-b8fd-a96e27ea14b8") : object "kube-system"/"original-pull-secret" not registered
Apr 21 03:57:57.956075 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:57.956053 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/ee91e745-0c10-4d73-b8fd-a96e27ea14b8-dbus\") pod \"global-pull-secret-syncer-t75ld\" (UID: \"ee91e745-0c10-4d73-b8fd-a96e27ea14b8\") " pod="kube-system/global-pull-secret-syncer-t75ld"
Apr 21 03:57:57.956248 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:57.956221 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/ee91e745-0c10-4d73-b8fd-a96e27ea14b8-dbus\") pod \"global-pull-secret-syncer-t75ld\" (UID: \"ee91e745-0c10-4d73-b8fd-a96e27ea14b8\") " pod="kube-system/global-pull-secret-syncer-t75ld"
Apr 21 03:57:58.065347 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:58.065315 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-7j4lb" event={"ID":"b4a64573-596e-4e34-a0d2-ec31f17a6ba5","Type":"ContainerStarted","Data":"e7937944b7df619f783b4e19c0bfa981855e63886ffa5415cbf718605cce10a9"}
Apr 21 03:57:58.066602 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:58.066575 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-jpnv7" event={"ID":"30292a27-0318-4f49-b26c-f54654ac07db","Type":"ContainerStarted","Data":"e127a0f78bf6183403656e1fefabb367e6c3c36003508dab9c329312f878602a"}
Apr 21 03:57:58.067851 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:58.067830 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-jxqfz" event={"ID":"f4acd3fa-d747-4053-9af8-c38066e122ab","Type":"ContainerStarted","Data":"90d433fe49692f0ecb93c80402ccaf3014fe8b9d781b8d32d4710d83a63a8c88"}
Apr 21 03:57:58.069242 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:58.069216 2578 generic.go:358] "Generic (PLEG): container finished" podID="7d94c65271f325dffc877a062645ae79" containerID="c377ecbeac9841409955ec857e3975a4a02e3f8e35edbe6a6ace2d5f9a404b08" exitCode=0
Apr 21 03:57:58.069332 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:58.069312 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-136.ec2.internal" event={"ID":"7d94c65271f325dffc877a062645ae79","Type":"ContainerDied","Data":"c377ecbeac9841409955ec857e3975a4a02e3f8e35edbe6a6ace2d5f9a404b08"}
Apr 21 03:57:58.070690 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:58.070668 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-648s8" event={"ID":"a2aaf4f5-f0bd-4a7f-911d-4030333f4bb7","Type":"ContainerStarted","Data":"f4a02f7d77a33c176663e845467b5a03707e8e12c22cf85cd4cca73c96579e63"}
Apr 21 03:57:58.072195 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:58.072110 2578 generic.go:358] "Generic (PLEG): container finished" podID="cd7df76b-5df8-4dcb-8e6c-8e98f5533e3b" containerID="44cb566f160badaf2ef435d36cb91ff95015666979c2294ad50562bc8d628ae7" exitCode=0
Apr 21 03:57:58.072275 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:58.072197 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kmxxq" event={"ID":"cd7df76b-5df8-4dcb-8e6c-8e98f5533e3b","Type":"ContainerDied","Data":"44cb566f160badaf2ef435d36cb91ff95015666979c2294ad50562bc8d628ae7"}
Apr 21 03:57:58.073759 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:58.073722 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-7vx69" event={"ID":"adbc5b28-9686-4704-b170-0ada296e15b8","Type":"ContainerStarted","Data":"b884d4fd2648c1ce62ff87b1444db8417059d2a3c6dbfe444dbc501a14f41320"}
Apr 21 03:57:58.085472 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:58.083219 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-7j4lb" podStartSLOduration=2.99264057 podStartE2EDuration="21.083203269s" podCreationTimestamp="2026-04-21 03:57:37 +0000 UTC" firstStartedPulling="2026-04-21 03:57:38.151320954 +0000 UTC m=+1.820760009" lastFinishedPulling="2026-04-21 03:57:56.241883644 +0000 UTC m=+19.911322708" observedRunningTime="2026-04-21 03:57:58.082842533 +0000 UTC m=+21.752281608" watchObservedRunningTime="2026-04-21 03:57:58.083203269 +0000 UTC m=+21.752642345"
Apr 21 03:57:58.097826 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:58.097757 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-jxqfz" podStartSLOduration=3.1281906680000002 podStartE2EDuration="21.097740447s" podCreationTimestamp="2026-04-21 03:57:37 +0000 UTC" firstStartedPulling="2026-04-21 03:57:38.249021614 +0000 UTC m=+1.918460667" lastFinishedPulling="2026-04-21 03:57:56.218571387 +0000 UTC m=+19.888010446" observedRunningTime="2026-04-21 03:57:58.097521391 +0000 UTC m=+21.766960460" watchObservedRunningTime="2026-04-21 03:57:58.097740447 +0000 UTC m=+21.767179523"
Apr 21 03:57:58.104071 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:58.104046 2578 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 21 03:57:58.111739 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:58.111436 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-jpnv7" podStartSLOduration=3.122253078 podStartE2EDuration="21.111419609s" podCreationTimestamp="2026-04-21 03:57:37 +0000 UTC" firstStartedPulling="2026-04-21 03:57:38.252674842 +0000 UTC m=+1.922113895" lastFinishedPulling="2026-04-21 03:57:56.24184115 +0000 UTC m=+19.911280426" observedRunningTime="2026-04-21 03:57:58.110870779 +0000 UTC m=+21.780309853" watchObservedRunningTime="2026-04-21 03:57:58.111419609 +0000 UTC m=+21.780858728"
Apr 21 03:57:58.138614 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:58.138562 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-7vx69" podStartSLOduration=3.062325502 podStartE2EDuration="21.138548584s" podCreationTimestamp="2026-04-21 03:57:37 +0000 UTC" firstStartedPulling="2026-04-21 03:57:38.165541397 +0000 UTC m=+1.834980450" lastFinishedPulling="2026-04-21 03:57:56.241764479 +0000 UTC m=+19.911203532" observedRunningTime="2026-04-21 03:57:58.124657535 +0000 UTC m=+21.794096611" watchObservedRunningTime="2026-04-21 03:57:58.138548584 +0000 UTC m=+21.807987696"
Apr 21 03:57:58.460197 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:58.460158 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/ee91e745-0c10-4d73-b8fd-a96e27ea14b8-original-pull-secret\") pod \"global-pull-secret-syncer-t75ld\" (UID: \"ee91e745-0c10-4d73-b8fd-a96e27ea14b8\") " pod="kube-system/global-pull-secret-syncer-t75ld"
Apr 21 03:57:58.460413 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:57:58.460320 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 21 03:57:58.460413 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:57:58.460393 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ee91e745-0c10-4d73-b8fd-a96e27ea14b8-original-pull-secret podName:ee91e745-0c10-4d73-b8fd-a96e27ea14b8 nodeName:}" failed. No retries permitted until 2026-04-21 03:57:59.460375821 +0000 UTC m=+23.129814879 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/ee91e745-0c10-4d73-b8fd-a96e27ea14b8-original-pull-secret") pod "global-pull-secret-syncer-t75ld" (UID: "ee91e745-0c10-4d73-b8fd-a96e27ea14b8") : object "kube-system"/"original-pull-secret" not registered
Apr 21 03:57:58.878358 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:58.878228 2578 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-21T03:57:58.104068042Z","UUID":"e3c4c00b-e2bb-446d-9c03-25c87905c924","Handler":null,"Name":"","Endpoint":""}
Apr 21 03:57:58.880135 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:58.880069 2578 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 21 03:57:58.880135 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:58.880103 2578 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 21 03:57:59.078831 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:59.078805 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-82wml_37b63820-2bf0-4a5b-82c5-6b56ab7689b7/ovn-acl-logging/0.log"
Apr 21 03:57:59.079258 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:59.079224 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-82wml" event={"ID":"37b63820-2bf0-4a5b-82c5-6b56ab7689b7","Type":"ContainerStarted","Data":"bc709f4f47a5a165870c8392708bc28d621590cc494428e4b38a0da4e7149950"}
Apr 21 03:57:59.080945 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:59.080916 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-136.ec2.internal" event={"ID":"7d94c65271f325dffc877a062645ae79","Type":"ContainerStarted","Data":"28f62dd6ee17c77a1ecd2fbde019416bd5642df173d19c9d05cc34c5bbaf5364"}
Apr 21 03:57:59.083355 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:59.083009 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-648s8" event={"ID":"a2aaf4f5-f0bd-4a7f-911d-4030333f4bb7","Type":"ContainerStarted","Data":"3cd30772aadf6bbe2d000f2a0006cc9acac8a785f3be4808d40720c5efcf6d55"}
Apr 21 03:57:59.083355 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:59.083041 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-648s8" event={"ID":"a2aaf4f5-f0bd-4a7f-911d-4030333f4bb7","Type":"ContainerStarted","Data":"e0e870b75793c0f026d6fc0167c1f0048f6883873805b722c8354e9f86766148"}
Apr 21 03:57:59.095379 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:59.095331 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-136.ec2.internal" podStartSLOduration=22.095313192 podStartE2EDuration="22.095313192s" podCreationTimestamp="2026-04-21 03:57:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 03:57:59.094633303 +0000 UTC m=+22.764072381" watchObservedRunningTime="2026-04-21 03:57:59.095313192 +0000 UTC m=+22.764752259"
Apr 21 03:57:59.110289 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:59.110185 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-648s8" podStartSLOduration=1.466673649 podStartE2EDuration="22.11017264s" podCreationTimestamp="2026-04-21 03:57:37 +0000 UTC" firstStartedPulling="2026-04-21 03:57:38.215151457 +0000 UTC m=+1.884590509" lastFinishedPulling="2026-04-21 03:57:58.858650429 +0000 UTC m=+22.528089500" observedRunningTime="2026-04-21 03:57:59.109979526 +0000 UTC m=+22.779418615" watchObservedRunningTime="2026-04-21 03:57:59.11017264 +0000 UTC m=+22.779611715"
Apr 21 03:57:59.467847 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:59.467683 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/ee91e745-0c10-4d73-b8fd-a96e27ea14b8-original-pull-secret\") pod \"global-pull-secret-syncer-t75ld\" (UID: \"ee91e745-0c10-4d73-b8fd-a96e27ea14b8\") " pod="kube-system/global-pull-secret-syncer-t75ld"
Apr 21 03:57:59.468012 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:57:59.467841 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 21 03:57:59.468012 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:57:59.467932 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ee91e745-0c10-4d73-b8fd-a96e27ea14b8-original-pull-secret podName:ee91e745-0c10-4d73-b8fd-a96e27ea14b8 nodeName:}" failed. No retries permitted until 2026-04-21 03:58:01.467912264 +0000 UTC m=+25.137351332 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/ee91e745-0c10-4d73-b8fd-a96e27ea14b8-original-pull-secret") pod "global-pull-secret-syncer-t75ld" (UID: "ee91e745-0c10-4d73-b8fd-a96e27ea14b8") : object "kube-system"/"original-pull-secret" not registered
Apr 21 03:57:59.924881 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:59.924848 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-n6d2m"
Apr 21 03:57:59.925057 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:59.924919 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-t75ld"
Apr 21 03:57:59.925057 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:57:59.925015 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-t75ld" podUID="ee91e745-0c10-4d73-b8fd-a96e27ea14b8"
Apr 21 03:57:59.925057 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:57:59.925045 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2lrq9"
Apr 21 03:57:59.925227 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:57:59.925153 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2lrq9" podUID="6e98b17f-4794-44aa-8756-58a9bd9cb37a"
Apr 21 03:57:59.925300 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:57:59.925244 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-n6d2m" podUID="b5d6aff7-e2e1-4646-9d82-8c931e49196e"
Apr 21 03:58:01.291419 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:58:01.291323 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-7vx69"
Apr 21 03:58:01.292210 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:58:01.292193 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-7vx69"
Apr 21 03:58:01.482675 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:58:01.482641 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/ee91e745-0c10-4d73-b8fd-a96e27ea14b8-original-pull-secret\") pod \"global-pull-secret-syncer-t75ld\" (UID: \"ee91e745-0c10-4d73-b8fd-a96e27ea14b8\") " pod="kube-system/global-pull-secret-syncer-t75ld"
Apr 21 03:58:01.482858 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:58:01.482838 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 21 03:58:01.482923 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:58:01.482910 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ee91e745-0c10-4d73-b8fd-a96e27ea14b8-original-pull-secret podName:ee91e745-0c10-4d73-b8fd-a96e27ea14b8 nodeName:}" failed. No retries permitted until 2026-04-21 03:58:05.482896224 +0000 UTC m=+29.152335277 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/ee91e745-0c10-4d73-b8fd-a96e27ea14b8-original-pull-secret") pod "global-pull-secret-syncer-t75ld" (UID: "ee91e745-0c10-4d73-b8fd-a96e27ea14b8") : object "kube-system"/"original-pull-secret" not registered
Apr 21 03:58:01.924238 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:58:01.924186 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-n6d2m"
Apr 21 03:58:01.924411 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:58:01.924241 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-t75ld"
Apr 21 03:58:01.924411 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:58:01.924327 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-n6d2m" podUID="b5d6aff7-e2e1-4646-9d82-8c931e49196e"
Apr 21 03:58:01.924411 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:58:01.924366 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-t75ld" podUID="ee91e745-0c10-4d73-b8fd-a96e27ea14b8"
Apr 21 03:58:01.924411 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:58:01.924407 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2lrq9"
Apr 21 03:58:01.924540 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:58:01.924500 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2lrq9" podUID="6e98b17f-4794-44aa-8756-58a9bd9cb37a"
Apr 21 03:58:02.091355 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:58:02.091241 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-82wml_37b63820-2bf0-4a5b-82c5-6b56ab7689b7/ovn-acl-logging/0.log"
Apr 21 03:58:02.091802 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:58:02.091619 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-82wml" event={"ID":"37b63820-2bf0-4a5b-82c5-6b56ab7689b7","Type":"ContainerStarted","Data":"1801a29d7755a8ebb57b6ae9142b2150159027bc7b78e1e63fca091bfce5ba18"}
Apr 21 03:58:02.092370 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:58:02.091935 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-82wml"
Apr 21 03:58:02.092370 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:58:02.091963 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-82wml"
Apr 21 03:58:02.092370 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:58:02.091976 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-82wml"
Apr 21 03:58:02.092370 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:58:02.092032 2578 scope.go:117] "RemoveContainer" containerID="df8604e4197beecc89f91febd3e019e34ea62fb831e195b18dbc062e817cbda9"
Apr 21 03:58:02.107595 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:58:02.107495 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-82wml"
Apr 21 03:58:02.109142 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:58:02.109027 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-82wml"
Apr 21 03:58:03.095161 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:58:03.095125 2578 generic.go:358] "Generic (PLEG): container finished" podID="cd7df76b-5df8-4dcb-8e6c-8e98f5533e3b" containerID="d49aa1765f01c5f9c4fbf40f343c89558670eb1b809824c6f49f7068ae3280ea" exitCode=0
Apr 21 03:58:03.095996 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:58:03.095209 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kmxxq" event={"ID":"cd7df76b-5df8-4dcb-8e6c-8e98f5533e3b","Type":"ContainerDied","Data":"d49aa1765f01c5f9c4fbf40f343c89558670eb1b809824c6f49f7068ae3280ea"}
Apr 21 03:58:03.098404 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:58:03.098349 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-82wml_37b63820-2bf0-4a5b-82c5-6b56ab7689b7/ovn-acl-logging/0.log"
Apr 21 03:58:03.098731 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:58:03.098709 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-82wml" event={"ID":"37b63820-2bf0-4a5b-82c5-6b56ab7689b7","Type":"ContainerStarted","Data":"719631af20bf4ef435d86cc413479412cce623902cfcce61a547712b945ee9ba"}
Apr 21 03:58:03.141415 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:58:03.141367 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-82wml" podStartSLOduration=8.055275671 podStartE2EDuration="26.141354159s" podCreationTimestamp="2026-04-21 03:57:37 +0000 UTC" firstStartedPulling="2026-04-21 03:57:38.198587193 +0000 UTC m=+1.868026246" lastFinishedPulling="2026-04-21 03:57:56.284665681 +0000 UTC m=+19.954104734" observedRunningTime="2026-04-21 03:58:03.141254302 +0000 UTC m=+26.810693377" watchObservedRunningTime="2026-04-21 03:58:03.141354159 +0000 UTC m=+26.810793234"
Apr 21 03:58:03.924468 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:58:03.924178 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-t75ld"
Apr 21 03:58:03.924468 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:58:03.924185 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2lrq9"
Apr 21 03:58:03.924468 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:58:03.924462 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-t75ld" podUID="ee91e745-0c10-4d73-b8fd-a96e27ea14b8"
Apr 21 03:58:03.924730 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:58:03.924548 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2lrq9" podUID="6e98b17f-4794-44aa-8756-58a9bd9cb37a"
Apr 21 03:58:03.924730 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:58:03.924185 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-n6d2m"
Apr 21 03:58:03.924730 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:58:03.924623 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-n6d2m" podUID="b5d6aff7-e2e1-4646-9d82-8c931e49196e"
Apr 21 03:58:03.980844 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:58:03.980811 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-t75ld"]
Apr 21 03:58:03.983744 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:58:03.983713 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-2lrq9"]
Apr 21 03:58:03.984243 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:58:03.984226 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-n6d2m"]
Apr 21 03:58:04.102079 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:58:04.102046 2578 generic.go:358] "Generic (PLEG): container finished" podID="cd7df76b-5df8-4dcb-8e6c-8e98f5533e3b" containerID="97c6bf5157562f2903b71c66e8c6746338991382c8348a7fbdccdc8953e41af6" exitCode=0
Apr 21 03:58:04.102544 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:58:04.102186 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kmxxq" event={"ID":"cd7df76b-5df8-4dcb-8e6c-8e98f5533e3b","Type":"ContainerDied","Data":"97c6bf5157562f2903b71c66e8c6746338991382c8348a7fbdccdc8953e41af6"}
Apr 21 03:58:04.102544 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:58:04.102211 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-t75ld"
Apr 21 03:58:04.102544 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:58:04.102319 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-t75ld" podUID="ee91e745-0c10-4d73-b8fd-a96e27ea14b8"
Apr 21 03:58:04.102544 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:58:04.102376 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-n6d2m"
Apr 21 03:58:04.102544 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:58:04.102448 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-n6d2m" podUID="b5d6aff7-e2e1-4646-9d82-8c931e49196e"
Apr 21 03:58:04.102544 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:58:04.102538 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2lrq9"
Apr 21 03:58:04.102770 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:58:04.102644 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2lrq9" podUID="6e98b17f-4794-44aa-8756-58a9bd9cb37a"
Apr 21 03:58:05.106055 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:58:05.106026 2578 generic.go:358] "Generic (PLEG): container finished" podID="cd7df76b-5df8-4dcb-8e6c-8e98f5533e3b" containerID="3cf3de2447a07fc8b78feae35e83cf839cc432e4dea6d7766b16d437c36cf2d7" exitCode=0
Apr 21 03:58:05.106451 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:58:05.106068 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kmxxq" event={"ID":"cd7df76b-5df8-4dcb-8e6c-8e98f5533e3b","Type":"ContainerDied","Data":"3cf3de2447a07fc8b78feae35e83cf839cc432e4dea6d7766b16d437c36cf2d7"}
Apr 21 03:58:05.511900 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:58:05.511607 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/ee91e745-0c10-4d73-b8fd-a96e27ea14b8-original-pull-secret\") pod \"global-pull-secret-syncer-t75ld\" (UID: \"ee91e745-0c10-4d73-b8fd-a96e27ea14b8\") " pod="kube-system/global-pull-secret-syncer-t75ld"
Apr 21 03:58:05.511900 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:58:05.511764 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 21 03:58:05.511900 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:58:05.511832 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ee91e745-0c10-4d73-b8fd-a96e27ea14b8-original-pull-secret podName:ee91e745-0c10-4d73-b8fd-a96e27ea14b8 nodeName:}" failed. No retries permitted until 2026-04-21 03:58:13.511813371 +0000 UTC m=+37.181252427 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/ee91e745-0c10-4d73-b8fd-a96e27ea14b8-original-pull-secret") pod "global-pull-secret-syncer-t75ld" (UID: "ee91e745-0c10-4d73-b8fd-a96e27ea14b8") : object "kube-system"/"original-pull-secret" not registered
Apr 21 03:58:05.924883 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:58:05.924843 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-n6d2m"
Apr 21 03:58:05.924883 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:58:05.924876 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-t75ld"
Apr 21 03:58:05.925094 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:58:05.924963 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2lrq9"
Apr 21 03:58:05.925094 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:58:05.924981 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-n6d2m" podUID="b5d6aff7-e2e1-4646-9d82-8c931e49196e"
Apr 21 03:58:05.925094 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:58:05.925071 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2lrq9" podUID="6e98b17f-4794-44aa-8756-58a9bd9cb37a"
Apr 21 03:58:05.925202 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:58:05.925157 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-t75ld" podUID="ee91e745-0c10-4d73-b8fd-a96e27ea14b8"
Apr 21 03:58:06.488467 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:58:06.488427 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-7vx69"
Apr 21 03:58:06.488978 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:58:06.488578 2578 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 21 03:58:06.489237 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:58:06.489155 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-7vx69"
Apr 21 03:58:07.924405 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:58:07.924372 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-n6d2m"
Apr 21 03:58:07.924833 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:58:07.924372 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-t75ld"
Apr 21 03:58:07.924833 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:58:07.924477 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-network-diagnostics/network-check-target-n6d2m" podUID="b5d6aff7-e2e1-4646-9d82-8c931e49196e" Apr 21 03:58:07.924833 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:58:07.924568 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-t75ld" podUID="ee91e745-0c10-4d73-b8fd-a96e27ea14b8" Apr 21 03:58:07.924833 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:58:07.924372 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2lrq9" Apr 21 03:58:07.924833 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:58:07.924676 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2lrq9" podUID="6e98b17f-4794-44aa-8756-58a9bd9cb37a" Apr 21 03:58:08.646188 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:58:08.645927 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-136.ec2.internal" event="NodeReady" Apr 21 03:58:08.646386 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:58:08.646335 2578 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 21 03:58:08.699772 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:58:08.699741 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-lk9pj"] Apr 21 03:58:08.736176 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:58:08.736149 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-sjcwm"] Apr 21 03:58:08.736374 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:58:08.736340 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-lk9pj" Apr 21 03:58:08.739173 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:58:08.739151 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 21 03:58:08.739321 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:58:08.739231 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 21 03:58:08.739321 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:58:08.739234 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-7tx7c\"" Apr 21 03:58:08.765623 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:58:08.765597 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-lk9pj"] Apr 21 03:58:08.765732 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:58:08.765649 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-sjcwm"] 
Apr 21 03:58:08.765732 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:58:08.765664 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-sjcwm" Apr 21 03:58:08.768171 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:58:08.768152 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 21 03:58:08.768749 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:58:08.768594 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 21 03:58:08.768749 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:58:08.768620 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 21 03:58:08.768749 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:58:08.768596 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-7mknc\"" Apr 21 03:58:08.837187 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:58:08.837143 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5b9715dc-7dd5-46ea-961a-2f02107a7655-config-volume\") pod \"dns-default-lk9pj\" (UID: \"5b9715dc-7dd5-46ea-961a-2f02107a7655\") " pod="openshift-dns/dns-default-lk9pj" Apr 21 03:58:08.837187 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:58:08.837190 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e9350a7b-5fe0-4e30-8c05-5ee260472029-cert\") pod \"ingress-canary-sjcwm\" (UID: \"e9350a7b-5fe0-4e30-8c05-5ee260472029\") " pod="openshift-ingress-canary/ingress-canary-sjcwm" Apr 21 03:58:08.837449 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:58:08.837219 2578 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5b9715dc-7dd5-46ea-961a-2f02107a7655-metrics-tls\") pod \"dns-default-lk9pj\" (UID: \"5b9715dc-7dd5-46ea-961a-2f02107a7655\") " pod="openshift-dns/dns-default-lk9pj" Apr 21 03:58:08.837449 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:58:08.837241 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/5b9715dc-7dd5-46ea-961a-2f02107a7655-tmp-dir\") pod \"dns-default-lk9pj\" (UID: \"5b9715dc-7dd5-46ea-961a-2f02107a7655\") " pod="openshift-dns/dns-default-lk9pj" Apr 21 03:58:08.837449 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:58:08.837270 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8zf5r\" (UniqueName: \"kubernetes.io/projected/e9350a7b-5fe0-4e30-8c05-5ee260472029-kube-api-access-8zf5r\") pod \"ingress-canary-sjcwm\" (UID: \"e9350a7b-5fe0-4e30-8c05-5ee260472029\") " pod="openshift-ingress-canary/ingress-canary-sjcwm" Apr 21 03:58:08.837449 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:58:08.837370 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8w2g\" (UniqueName: \"kubernetes.io/projected/5b9715dc-7dd5-46ea-961a-2f02107a7655-kube-api-access-m8w2g\") pod \"dns-default-lk9pj\" (UID: \"5b9715dc-7dd5-46ea-961a-2f02107a7655\") " pod="openshift-dns/dns-default-lk9pj" Apr 21 03:58:08.938539 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:58:08.938464 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m8w2g\" (UniqueName: \"kubernetes.io/projected/5b9715dc-7dd5-46ea-961a-2f02107a7655-kube-api-access-m8w2g\") pod \"dns-default-lk9pj\" (UID: \"5b9715dc-7dd5-46ea-961a-2f02107a7655\") " pod="openshift-dns/dns-default-lk9pj" Apr 21 03:58:08.938539 
ip-10-0-134-136 kubenswrapper[2578]: I0421 03:58:08.938538 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5b9715dc-7dd5-46ea-961a-2f02107a7655-config-volume\") pod \"dns-default-lk9pj\" (UID: \"5b9715dc-7dd5-46ea-961a-2f02107a7655\") " pod="openshift-dns/dns-default-lk9pj" Apr 21 03:58:08.939017 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:58:08.938569 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e9350a7b-5fe0-4e30-8c05-5ee260472029-cert\") pod \"ingress-canary-sjcwm\" (UID: \"e9350a7b-5fe0-4e30-8c05-5ee260472029\") " pod="openshift-ingress-canary/ingress-canary-sjcwm" Apr 21 03:58:08.939017 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:58:08.938668 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 03:58:08.939017 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:58:08.938728 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e9350a7b-5fe0-4e30-8c05-5ee260472029-cert podName:e9350a7b-5fe0-4e30-8c05-5ee260472029 nodeName:}" failed. No retries permitted until 2026-04-21 03:58:09.438708075 +0000 UTC m=+33.108147142 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e9350a7b-5fe0-4e30-8c05-5ee260472029-cert") pod "ingress-canary-sjcwm" (UID: "e9350a7b-5fe0-4e30-8c05-5ee260472029") : secret "canary-serving-cert" not found Apr 21 03:58:08.939017 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:58:08.938890 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5b9715dc-7dd5-46ea-961a-2f02107a7655-metrics-tls\") pod \"dns-default-lk9pj\" (UID: \"5b9715dc-7dd5-46ea-961a-2f02107a7655\") " pod="openshift-dns/dns-default-lk9pj" Apr 21 03:58:08.939017 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:58:08.938929 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/5b9715dc-7dd5-46ea-961a-2f02107a7655-tmp-dir\") pod \"dns-default-lk9pj\" (UID: \"5b9715dc-7dd5-46ea-961a-2f02107a7655\") " pod="openshift-dns/dns-default-lk9pj" Apr 21 03:58:08.939017 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:58:08.938956 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8zf5r\" (UniqueName: \"kubernetes.io/projected/e9350a7b-5fe0-4e30-8c05-5ee260472029-kube-api-access-8zf5r\") pod \"ingress-canary-sjcwm\" (UID: \"e9350a7b-5fe0-4e30-8c05-5ee260472029\") " pod="openshift-ingress-canary/ingress-canary-sjcwm" Apr 21 03:58:08.939017 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:58:08.938980 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 03:58:08.939366 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:58:08.939037 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5b9715dc-7dd5-46ea-961a-2f02107a7655-metrics-tls podName:5b9715dc-7dd5-46ea-961a-2f02107a7655 nodeName:}" failed. 
No retries permitted until 2026-04-21 03:58:09.439017508 +0000 UTC m=+33.108456576 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5b9715dc-7dd5-46ea-961a-2f02107a7655-metrics-tls") pod "dns-default-lk9pj" (UID: "5b9715dc-7dd5-46ea-961a-2f02107a7655") : secret "dns-default-metrics-tls" not found Apr 21 03:58:08.939366 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:58:08.939188 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/5b9715dc-7dd5-46ea-961a-2f02107a7655-tmp-dir\") pod \"dns-default-lk9pj\" (UID: \"5b9715dc-7dd5-46ea-961a-2f02107a7655\") " pod="openshift-dns/dns-default-lk9pj" Apr 21 03:58:08.947854 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:58:08.947831 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5b9715dc-7dd5-46ea-961a-2f02107a7655-config-volume\") pod \"dns-default-lk9pj\" (UID: \"5b9715dc-7dd5-46ea-961a-2f02107a7655\") " pod="openshift-dns/dns-default-lk9pj" Apr 21 03:58:08.949186 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:58:08.949164 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m8w2g\" (UniqueName: \"kubernetes.io/projected/5b9715dc-7dd5-46ea-961a-2f02107a7655-kube-api-access-m8w2g\") pod \"dns-default-lk9pj\" (UID: \"5b9715dc-7dd5-46ea-961a-2f02107a7655\") " pod="openshift-dns/dns-default-lk9pj" Apr 21 03:58:08.949342 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:58:08.949320 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8zf5r\" (UniqueName: \"kubernetes.io/projected/e9350a7b-5fe0-4e30-8c05-5ee260472029-kube-api-access-8zf5r\") pod \"ingress-canary-sjcwm\" (UID: \"e9350a7b-5fe0-4e30-8c05-5ee260472029\") " pod="openshift-ingress-canary/ingress-canary-sjcwm" Apr 21 03:58:09.443727 ip-10-0-134-136 kubenswrapper[2578]: I0421 
03:58:09.443690 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e9350a7b-5fe0-4e30-8c05-5ee260472029-cert\") pod \"ingress-canary-sjcwm\" (UID: \"e9350a7b-5fe0-4e30-8c05-5ee260472029\") " pod="openshift-ingress-canary/ingress-canary-sjcwm" Apr 21 03:58:09.443941 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:58:09.443740 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5b9715dc-7dd5-46ea-961a-2f02107a7655-metrics-tls\") pod \"dns-default-lk9pj\" (UID: \"5b9715dc-7dd5-46ea-961a-2f02107a7655\") " pod="openshift-dns/dns-default-lk9pj" Apr 21 03:58:09.443941 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:58:09.443834 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 03:58:09.443941 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:58:09.443870 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 03:58:09.443941 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:58:09.443904 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e9350a7b-5fe0-4e30-8c05-5ee260472029-cert podName:e9350a7b-5fe0-4e30-8c05-5ee260472029 nodeName:}" failed. No retries permitted until 2026-04-21 03:58:10.443883083 +0000 UTC m=+34.113322138 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e9350a7b-5fe0-4e30-8c05-5ee260472029-cert") pod "ingress-canary-sjcwm" (UID: "e9350a7b-5fe0-4e30-8c05-5ee260472029") : secret "canary-serving-cert" not found Apr 21 03:58:09.443941 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:58:09.443923 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5b9715dc-7dd5-46ea-961a-2f02107a7655-metrics-tls podName:5b9715dc-7dd5-46ea-961a-2f02107a7655 nodeName:}" failed. No retries permitted until 2026-04-21 03:58:10.443914653 +0000 UTC m=+34.113353706 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5b9715dc-7dd5-46ea-961a-2f02107a7655-metrics-tls") pod "dns-default-lk9pj" (UID: "5b9715dc-7dd5-46ea-961a-2f02107a7655") : secret "dns-default-metrics-tls" not found Apr 21 03:58:09.544973 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:58:09.544943 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6e98b17f-4794-44aa-8756-58a9bd9cb37a-metrics-certs\") pod \"network-metrics-daemon-2lrq9\" (UID: \"6e98b17f-4794-44aa-8756-58a9bd9cb37a\") " pod="openshift-multus/network-metrics-daemon-2lrq9" Apr 21 03:58:09.545154 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:58:09.545027 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8jhrv\" (UniqueName: \"kubernetes.io/projected/b5d6aff7-e2e1-4646-9d82-8c931e49196e-kube-api-access-8jhrv\") pod \"network-check-target-n6d2m\" (UID: \"b5d6aff7-e2e1-4646-9d82-8c931e49196e\") " pod="openshift-network-diagnostics/network-check-target-n6d2m" Apr 21 03:58:09.545154 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:58:09.545102 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 03:58:09.545154 
ip-10-0-134-136 kubenswrapper[2578]: E0421 03:58:09.545145 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 03:58:09.545318 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:58:09.545162 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 03:58:09.545318 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:58:09.545168 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6e98b17f-4794-44aa-8756-58a9bd9cb37a-metrics-certs podName:6e98b17f-4794-44aa-8756-58a9bd9cb37a nodeName:}" failed. No retries permitted until 2026-04-21 03:58:41.545152804 +0000 UTC m=+65.214591862 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6e98b17f-4794-44aa-8756-58a9bd9cb37a-metrics-certs") pod "network-metrics-daemon-2lrq9" (UID: "6e98b17f-4794-44aa-8756-58a9bd9cb37a") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 03:58:09.545318 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:58:09.545178 2578 projected.go:194] Error preparing data for projected volume kube-api-access-8jhrv for pod openshift-network-diagnostics/network-check-target-n6d2m: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 03:58:09.545318 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:58:09.545225 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b5d6aff7-e2e1-4646-9d82-8c931e49196e-kube-api-access-8jhrv podName:b5d6aff7-e2e1-4646-9d82-8c931e49196e nodeName:}" failed. No retries permitted until 2026-04-21 03:58:41.545209367 +0000 UTC m=+65.214648435 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-8jhrv" (UniqueName: "kubernetes.io/projected/b5d6aff7-e2e1-4646-9d82-8c931e49196e-kube-api-access-8jhrv") pod "network-check-target-n6d2m" (UID: "b5d6aff7-e2e1-4646-9d82-8c931e49196e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 03:58:09.924609 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:58:09.924571 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-n6d2m" Apr 21 03:58:09.924812 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:58:09.924571 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2lrq9" Apr 21 03:58:09.924812 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:58:09.924571 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-t75ld" Apr 21 03:58:09.928949 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:58:09.928915 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 21 03:58:09.929104 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:58:09.928962 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 21 03:58:09.929104 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:58:09.928969 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 21 03:58:09.929104 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:58:09.928926 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-sjhjf\"" Apr 21 03:58:09.929104 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:58:09.928926 2578 
reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 21 03:58:09.929104 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:58:09.928924 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-k29s5\"" Apr 21 03:58:10.450535 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:58:10.450501 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e9350a7b-5fe0-4e30-8c05-5ee260472029-cert\") pod \"ingress-canary-sjcwm\" (UID: \"e9350a7b-5fe0-4e30-8c05-5ee260472029\") " pod="openshift-ingress-canary/ingress-canary-sjcwm" Apr 21 03:58:10.450535 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:58:10.450542 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5b9715dc-7dd5-46ea-961a-2f02107a7655-metrics-tls\") pod \"dns-default-lk9pj\" (UID: \"5b9715dc-7dd5-46ea-961a-2f02107a7655\") " pod="openshift-dns/dns-default-lk9pj" Apr 21 03:58:10.451021 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:58:10.450656 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 03:58:10.451021 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:58:10.450682 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 03:58:10.451021 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:58:10.450722 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e9350a7b-5fe0-4e30-8c05-5ee260472029-cert podName:e9350a7b-5fe0-4e30-8c05-5ee260472029 nodeName:}" failed. No retries permitted until 2026-04-21 03:58:12.450706661 +0000 UTC m=+36.120145713 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e9350a7b-5fe0-4e30-8c05-5ee260472029-cert") pod "ingress-canary-sjcwm" (UID: "e9350a7b-5fe0-4e30-8c05-5ee260472029") : secret "canary-serving-cert" not found Apr 21 03:58:10.451021 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:58:10.450739 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5b9715dc-7dd5-46ea-961a-2f02107a7655-metrics-tls podName:5b9715dc-7dd5-46ea-961a-2f02107a7655 nodeName:}" failed. No retries permitted until 2026-04-21 03:58:12.450730253 +0000 UTC m=+36.120169306 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5b9715dc-7dd5-46ea-961a-2f02107a7655-metrics-tls") pod "dns-default-lk9pj" (UID: "5b9715dc-7dd5-46ea-961a-2f02107a7655") : secret "dns-default-metrics-tls" not found Apr 21 03:58:11.120865 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:58:11.120833 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kmxxq" event={"ID":"cd7df76b-5df8-4dcb-8e6c-8e98f5533e3b","Type":"ContainerStarted","Data":"c6c6e764c6ba3dde3e51e626492b35d3cf6249e8f89b0bc6d15975506a36927e"} Apr 21 03:58:12.125539 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:58:12.125502 2578 generic.go:358] "Generic (PLEG): container finished" podID="cd7df76b-5df8-4dcb-8e6c-8e98f5533e3b" containerID="c6c6e764c6ba3dde3e51e626492b35d3cf6249e8f89b0bc6d15975506a36927e" exitCode=0 Apr 21 03:58:12.126103 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:58:12.125557 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kmxxq" event={"ID":"cd7df76b-5df8-4dcb-8e6c-8e98f5533e3b","Type":"ContainerDied","Data":"c6c6e764c6ba3dde3e51e626492b35d3cf6249e8f89b0bc6d15975506a36927e"} Apr 21 03:58:12.465018 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:58:12.464944 2578 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e9350a7b-5fe0-4e30-8c05-5ee260472029-cert\") pod \"ingress-canary-sjcwm\" (UID: \"e9350a7b-5fe0-4e30-8c05-5ee260472029\") " pod="openshift-ingress-canary/ingress-canary-sjcwm" Apr 21 03:58:12.465018 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:58:12.464983 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5b9715dc-7dd5-46ea-961a-2f02107a7655-metrics-tls\") pod \"dns-default-lk9pj\" (UID: \"5b9715dc-7dd5-46ea-961a-2f02107a7655\") " pod="openshift-dns/dns-default-lk9pj" Apr 21 03:58:12.465181 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:58:12.465088 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 03:58:12.465181 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:58:12.465105 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 03:58:12.465181 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:58:12.465146 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5b9715dc-7dd5-46ea-961a-2f02107a7655-metrics-tls podName:5b9715dc-7dd5-46ea-961a-2f02107a7655 nodeName:}" failed. No retries permitted until 2026-04-21 03:58:16.465130267 +0000 UTC m=+40.134569322 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5b9715dc-7dd5-46ea-961a-2f02107a7655-metrics-tls") pod "dns-default-lk9pj" (UID: "5b9715dc-7dd5-46ea-961a-2f02107a7655") : secret "dns-default-metrics-tls" not found Apr 21 03:58:12.465181 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:58:12.465174 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e9350a7b-5fe0-4e30-8c05-5ee260472029-cert podName:e9350a7b-5fe0-4e30-8c05-5ee260472029 nodeName:}" failed. 
No retries permitted until 2026-04-21 03:58:16.465153032 +0000 UTC m=+40.134592097 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e9350a7b-5fe0-4e30-8c05-5ee260472029-cert") pod "ingress-canary-sjcwm" (UID: "e9350a7b-5fe0-4e30-8c05-5ee260472029") : secret "canary-serving-cert" not found
Apr 21 03:58:13.129931 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:58:13.129898 2578 generic.go:358] "Generic (PLEG): container finished" podID="cd7df76b-5df8-4dcb-8e6c-8e98f5533e3b" containerID="24cd080379a6018f1fc383de7849a8bb476524e91a44953d4ad3119b02ae7408" exitCode=0
Apr 21 03:58:13.130448 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:58:13.129945 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kmxxq" event={"ID":"cd7df76b-5df8-4dcb-8e6c-8e98f5533e3b","Type":"ContainerDied","Data":"24cd080379a6018f1fc383de7849a8bb476524e91a44953d4ad3119b02ae7408"}
Apr 21 03:58:13.572580 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:58:13.572536 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/ee91e745-0c10-4d73-b8fd-a96e27ea14b8-original-pull-secret\") pod \"global-pull-secret-syncer-t75ld\" (UID: \"ee91e745-0c10-4d73-b8fd-a96e27ea14b8\") " pod="kube-system/global-pull-secret-syncer-t75ld"
Apr 21 03:58:13.575183 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:58:13.575163 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/ee91e745-0c10-4d73-b8fd-a96e27ea14b8-original-pull-secret\") pod \"global-pull-secret-syncer-t75ld\" (UID: \"ee91e745-0c10-4d73-b8fd-a96e27ea14b8\") " pod="kube-system/global-pull-secret-syncer-t75ld"
Apr 21 03:58:13.848135 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:58:13.848090 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-t75ld"
Apr 21 03:58:13.990007 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:58:13.989752 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-t75ld"]
Apr 21 03:58:13.994062 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:58:13.994028 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podee91e745_0c10_4d73_b8fd_a96e27ea14b8.slice/crio-ce3e6d2ea68aea47bd387db2fc176a7613698521c85d0fb925bba9f9819d96c3 WatchSource:0}: Error finding container ce3e6d2ea68aea47bd387db2fc176a7613698521c85d0fb925bba9f9819d96c3: Status 404 returned error can't find the container with id ce3e6d2ea68aea47bd387db2fc176a7613698521c85d0fb925bba9f9819d96c3
Apr 21 03:58:14.134838 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:58:14.134726 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kmxxq" event={"ID":"cd7df76b-5df8-4dcb-8e6c-8e98f5533e3b","Type":"ContainerStarted","Data":"f9343b6e607369bd585c0f275201f09430f0c81820e28da4f5f548068b0d336f"}
Apr 21 03:58:14.135736 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:58:14.135710 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-t75ld" event={"ID":"ee91e745-0c10-4d73-b8fd-a96e27ea14b8","Type":"ContainerStarted","Data":"ce3e6d2ea68aea47bd387db2fc176a7613698521c85d0fb925bba9f9819d96c3"}
Apr 21 03:58:14.158406 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:58:14.158232 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-kmxxq" podStartSLOduration=4.482060105 podStartE2EDuration="37.15821707s" podCreationTimestamp="2026-04-21 03:57:37 +0000 UTC" firstStartedPulling="2026-04-21 03:57:38.181377571 +0000 UTC m=+1.850816623" lastFinishedPulling="2026-04-21 03:58:10.85753452 +0000 UTC m=+34.526973588" observedRunningTime="2026-04-21 03:58:14.158149793 +0000 UTC m=+37.827588868" watchObservedRunningTime="2026-04-21 03:58:14.15821707 +0000 UTC m=+37.827656139"
Apr 21 03:58:16.493848 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:58:16.493804 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e9350a7b-5fe0-4e30-8c05-5ee260472029-cert\") pod \"ingress-canary-sjcwm\" (UID: \"e9350a7b-5fe0-4e30-8c05-5ee260472029\") " pod="openshift-ingress-canary/ingress-canary-sjcwm"
Apr 21 03:58:16.494387 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:58:16.493862 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5b9715dc-7dd5-46ea-961a-2f02107a7655-metrics-tls\") pod \"dns-default-lk9pj\" (UID: \"5b9715dc-7dd5-46ea-961a-2f02107a7655\") " pod="openshift-dns/dns-default-lk9pj"
Apr 21 03:58:16.494387 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:58:16.493932 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 21 03:58:16.494387 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:58:16.493993 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 21 03:58:16.494387 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:58:16.494007 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e9350a7b-5fe0-4e30-8c05-5ee260472029-cert podName:e9350a7b-5fe0-4e30-8c05-5ee260472029 nodeName:}" failed. No retries permitted until 2026-04-21 03:58:24.49398877 +0000 UTC m=+48.163427824 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e9350a7b-5fe0-4e30-8c05-5ee260472029-cert") pod "ingress-canary-sjcwm" (UID: "e9350a7b-5fe0-4e30-8c05-5ee260472029") : secret "canary-serving-cert" not found
Apr 21 03:58:16.494387 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:58:16.494049 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5b9715dc-7dd5-46ea-961a-2f02107a7655-metrics-tls podName:5b9715dc-7dd5-46ea-961a-2f02107a7655 nodeName:}" failed. No retries permitted until 2026-04-21 03:58:24.494036367 +0000 UTC m=+48.163475420 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5b9715dc-7dd5-46ea-961a-2f02107a7655-metrics-tls") pod "dns-default-lk9pj" (UID: "5b9715dc-7dd5-46ea-961a-2f02107a7655") : secret "dns-default-metrics-tls" not found
Apr 21 03:58:18.147180 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:58:18.147139 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-t75ld" event={"ID":"ee91e745-0c10-4d73-b8fd-a96e27ea14b8","Type":"ContainerStarted","Data":"47cfdc89995d056eaa0cf2a3f00f3576db943aa659bdfee13ea8e0e9a38b8b70"}
Apr 21 03:58:18.161742 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:58:18.161699 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-t75ld" podStartSLOduration=17.226648424 podStartE2EDuration="21.161684703s" podCreationTimestamp="2026-04-21 03:57:57 +0000 UTC" firstStartedPulling="2026-04-21 03:58:13.995752503 +0000 UTC m=+37.665191556" lastFinishedPulling="2026-04-21 03:58:17.930788783 +0000 UTC m=+41.600227835" observedRunningTime="2026-04-21 03:58:18.161639775 +0000 UTC m=+41.831078850" watchObservedRunningTime="2026-04-21 03:58:18.161684703 +0000 UTC m=+41.831123788"
Apr 21 03:58:24.550655 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:58:24.550610 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e9350a7b-5fe0-4e30-8c05-5ee260472029-cert\") pod \"ingress-canary-sjcwm\" (UID: \"e9350a7b-5fe0-4e30-8c05-5ee260472029\") " pod="openshift-ingress-canary/ingress-canary-sjcwm"
Apr 21 03:58:24.550655 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:58:24.550656 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5b9715dc-7dd5-46ea-961a-2f02107a7655-metrics-tls\") pod \"dns-default-lk9pj\" (UID: \"5b9715dc-7dd5-46ea-961a-2f02107a7655\") " pod="openshift-dns/dns-default-lk9pj"
Apr 21 03:58:24.551090 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:58:24.550751 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 21 03:58:24.551090 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:58:24.550755 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 21 03:58:24.551090 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:58:24.550802 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5b9715dc-7dd5-46ea-961a-2f02107a7655-metrics-tls podName:5b9715dc-7dd5-46ea-961a-2f02107a7655 nodeName:}" failed. No retries permitted until 2026-04-21 03:58:40.550788256 +0000 UTC m=+64.220227308 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5b9715dc-7dd5-46ea-961a-2f02107a7655-metrics-tls") pod "dns-default-lk9pj" (UID: "5b9715dc-7dd5-46ea-961a-2f02107a7655") : secret "dns-default-metrics-tls" not found
Apr 21 03:58:24.551090 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:58:24.550815 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e9350a7b-5fe0-4e30-8c05-5ee260472029-cert podName:e9350a7b-5fe0-4e30-8c05-5ee260472029 nodeName:}" failed. No retries permitted until 2026-04-21 03:58:40.550809494 +0000 UTC m=+64.220248547 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e9350a7b-5fe0-4e30-8c05-5ee260472029-cert") pod "ingress-canary-sjcwm" (UID: "e9350a7b-5fe0-4e30-8c05-5ee260472029") : secret "canary-serving-cert" not found
Apr 21 03:58:34.112979 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:58:34.112944 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-82wml"
Apr 21 03:58:40.561725 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:58:40.561672 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e9350a7b-5fe0-4e30-8c05-5ee260472029-cert\") pod \"ingress-canary-sjcwm\" (UID: \"e9350a7b-5fe0-4e30-8c05-5ee260472029\") " pod="openshift-ingress-canary/ingress-canary-sjcwm"
Apr 21 03:58:40.561725 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:58:40.561725 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5b9715dc-7dd5-46ea-961a-2f02107a7655-metrics-tls\") pod \"dns-default-lk9pj\" (UID: \"5b9715dc-7dd5-46ea-961a-2f02107a7655\") " pod="openshift-dns/dns-default-lk9pj"
Apr 21 03:58:40.562218 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:58:40.561807 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 21 03:58:40.562218 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:58:40.561809 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 21 03:58:40.562218 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:58:40.561858 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5b9715dc-7dd5-46ea-961a-2f02107a7655-metrics-tls podName:5b9715dc-7dd5-46ea-961a-2f02107a7655 nodeName:}" failed. No retries permitted until 2026-04-21 03:59:12.561844851 +0000 UTC m=+96.231283903 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5b9715dc-7dd5-46ea-961a-2f02107a7655-metrics-tls") pod "dns-default-lk9pj" (UID: "5b9715dc-7dd5-46ea-961a-2f02107a7655") : secret "dns-default-metrics-tls" not found
Apr 21 03:58:40.562218 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:58:40.561903 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e9350a7b-5fe0-4e30-8c05-5ee260472029-cert podName:e9350a7b-5fe0-4e30-8c05-5ee260472029 nodeName:}" failed. No retries permitted until 2026-04-21 03:59:12.561889854 +0000 UTC m=+96.231328907 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e9350a7b-5fe0-4e30-8c05-5ee260472029-cert") pod "ingress-canary-sjcwm" (UID: "e9350a7b-5fe0-4e30-8c05-5ee260472029") : secret "canary-serving-cert" not found
Apr 21 03:58:41.569885 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:58:41.569844 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8jhrv\" (UniqueName: \"kubernetes.io/projected/b5d6aff7-e2e1-4646-9d82-8c931e49196e-kube-api-access-8jhrv\") pod \"network-check-target-n6d2m\" (UID: \"b5d6aff7-e2e1-4646-9d82-8c931e49196e\") " pod="openshift-network-diagnostics/network-check-target-n6d2m"
Apr 21 03:58:41.570327 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:58:41.569910 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6e98b17f-4794-44aa-8756-58a9bd9cb37a-metrics-certs\") pod \"network-metrics-daemon-2lrq9\" (UID: \"6e98b17f-4794-44aa-8756-58a9bd9cb37a\") " pod="openshift-multus/network-metrics-daemon-2lrq9"
Apr 21 03:58:41.572724 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:58:41.572690 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 21 03:58:41.572724 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:58:41.572693 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 21 03:58:41.580869 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:58:41.580849 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 21 03:58:41.580928 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:58:41.580921 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6e98b17f-4794-44aa-8756-58a9bd9cb37a-metrics-certs podName:6e98b17f-4794-44aa-8756-58a9bd9cb37a nodeName:}" failed. No retries permitted until 2026-04-21 03:59:45.58090071 +0000 UTC m=+129.250339767 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6e98b17f-4794-44aa-8756-58a9bd9cb37a-metrics-certs") pod "network-metrics-daemon-2lrq9" (UID: "6e98b17f-4794-44aa-8756-58a9bd9cb37a") : secret "metrics-daemon-secret" not found
Apr 21 03:58:41.583170 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:58:41.583153 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 21 03:58:41.594416 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:58:41.594397 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8jhrv\" (UniqueName: \"kubernetes.io/projected/b5d6aff7-e2e1-4646-9d82-8c931e49196e-kube-api-access-8jhrv\") pod \"network-check-target-n6d2m\" (UID: \"b5d6aff7-e2e1-4646-9d82-8c931e49196e\") " pod="openshift-network-diagnostics/network-check-target-n6d2m"
Apr 21 03:58:41.749905 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:58:41.749874 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-sjhjf\""
Apr 21 03:58:41.757882 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:58:41.757858 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-n6d2m"
Apr 21 03:58:41.885761 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:58:41.885717 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-n6d2m"]
Apr 21 03:58:41.889662 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:58:41.889634 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb5d6aff7_e2e1_4646_9d82_8c931e49196e.slice/crio-3afbebc2a1984ae2b3a618fc8095c71dd52fd022deaea6ce411450c21681345a WatchSource:0}: Error finding container 3afbebc2a1984ae2b3a618fc8095c71dd52fd022deaea6ce411450c21681345a: Status 404 returned error can't find the container with id 3afbebc2a1984ae2b3a618fc8095c71dd52fd022deaea6ce411450c21681345a
Apr 21 03:58:42.188576 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:58:42.188489 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-n6d2m" event={"ID":"b5d6aff7-e2e1-4646-9d82-8c931e49196e","Type":"ContainerStarted","Data":"3afbebc2a1984ae2b3a618fc8095c71dd52fd022deaea6ce411450c21681345a"}
Apr 21 03:58:45.195418 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:58:45.195381 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-n6d2m" event={"ID":"b5d6aff7-e2e1-4646-9d82-8c931e49196e","Type":"ContainerStarted","Data":"6f4f7e6eaf09db2febd70386082bd23ebc4b5851d22bbe39dfdb921a4b097766"}
Apr 21 03:58:45.195824 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:58:45.195556 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-n6d2m"
Apr 21 03:58:45.210397 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:58:45.210347 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-n6d2m" podStartSLOduration=65.521156771 podStartE2EDuration="1m8.210332135s" podCreationTimestamp="2026-04-21 03:57:37 +0000 UTC" firstStartedPulling="2026-04-21 03:58:41.891592655 +0000 UTC m=+65.561031708" lastFinishedPulling="2026-04-21 03:58:44.580768004 +0000 UTC m=+68.250207072" observedRunningTime="2026-04-21 03:58:45.209466268 +0000 UTC m=+68.878905344" watchObservedRunningTime="2026-04-21 03:58:45.210332135 +0000 UTC m=+68.879771209"
Apr 21 03:59:12.579333 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:59:12.579164 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e9350a7b-5fe0-4e30-8c05-5ee260472029-cert\") pod \"ingress-canary-sjcwm\" (UID: \"e9350a7b-5fe0-4e30-8c05-5ee260472029\") " pod="openshift-ingress-canary/ingress-canary-sjcwm"
Apr 21 03:59:12.579333 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:59:12.579214 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5b9715dc-7dd5-46ea-961a-2f02107a7655-metrics-tls\") pod \"dns-default-lk9pj\" (UID: \"5b9715dc-7dd5-46ea-961a-2f02107a7655\") " pod="openshift-dns/dns-default-lk9pj"
Apr 21 03:59:12.579737 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:59:12.579357 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 21 03:59:12.579737 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:59:12.579368 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 21 03:59:12.579737 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:59:12.579427 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5b9715dc-7dd5-46ea-961a-2f02107a7655-metrics-tls podName:5b9715dc-7dd5-46ea-961a-2f02107a7655 nodeName:}" failed. No retries permitted until 2026-04-21 04:00:16.579408869 +0000 UTC m=+160.248847924 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5b9715dc-7dd5-46ea-961a-2f02107a7655-metrics-tls") pod "dns-default-lk9pj" (UID: "5b9715dc-7dd5-46ea-961a-2f02107a7655") : secret "dns-default-metrics-tls" not found
Apr 21 03:59:12.579737 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:59:12.579442 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e9350a7b-5fe0-4e30-8c05-5ee260472029-cert podName:e9350a7b-5fe0-4e30-8c05-5ee260472029 nodeName:}" failed. No retries permitted until 2026-04-21 04:00:16.579435887 +0000 UTC m=+160.248874940 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e9350a7b-5fe0-4e30-8c05-5ee260472029-cert") pod "ingress-canary-sjcwm" (UID: "e9350a7b-5fe0-4e30-8c05-5ee260472029") : secret "canary-serving-cert" not found
Apr 21 03:59:16.199838 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:59:16.199809 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-n6d2m"
Apr 21 03:59:23.330962 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:59:23.330924 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-9mdst"]
Apr 21 03:59:23.335468 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:59:23.335429 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-598f8f588b-lr9v8"]
Apr 21 03:59:23.335785 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:59:23.335757 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-9mdst"
Apr 21 03:59:23.338175 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:59:23.338150 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\""
Apr 21 03:59:23.338329 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:59:23.338249 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-598f8f588b-lr9v8"
Apr 21 03:59:23.338558 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:59:23.338497 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 21 03:59:23.338767 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:59:23.338750 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 21 03:59:23.340228 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:59:23.340214 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-vlbxh\""
Apr 21 03:59:23.340407 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:59:23.340388 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\""
Apr 21 03:59:23.340477 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:59:23.340436 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\""
Apr 21 03:59:23.340746 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:59:23.340732 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\""
Apr 21 03:59:23.340815 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:59:23.340733 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-dzn5d\""
Apr 21 03:59:23.340815 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:59:23.340797 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\""
Apr 21 03:59:23.340918 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:59:23.340820 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\""
Apr 21 03:59:23.341060 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:59:23.341045 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\""
Apr 21 03:59:23.341100 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:59:23.341069 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\""
Apr 21 03:59:23.343128 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:59:23.343104 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-9mdst"]
Apr 21 03:59:23.344151 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:59:23.344130 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-598f8f588b-lr9v8"]
Apr 21 03:59:23.455368 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:59:23.455328 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c8c7e63b-ee59-4ec9-bf86-150c2642daa7-metrics-certs\") pod \"router-default-598f8f588b-lr9v8\" (UID: \"c8c7e63b-ee59-4ec9-bf86-150c2642daa7\") " pod="openshift-ingress/router-default-598f8f588b-lr9v8"
Apr 21 03:59:23.455368 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:59:23.455373 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n6ftk\" (UniqueName: \"kubernetes.io/projected/474836e5-bd94-4a4c-a1ec-ee329f804cb6-kube-api-access-n6ftk\") pod \"cluster-monitoring-operator-75587bd455-9mdst\" (UID: \"474836e5-bd94-4a4c-a1ec-ee329f804cb6\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-9mdst"
Apr 21 03:59:23.455629 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:59:23.455406 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c8c7e63b-ee59-4ec9-bf86-150c2642daa7-default-certificate\") pod \"router-default-598f8f588b-lr9v8\" (UID: \"c8c7e63b-ee59-4ec9-bf86-150c2642daa7\") " pod="openshift-ingress/router-default-598f8f588b-lr9v8"
Apr 21 03:59:23.455629 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:59:23.455426 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/474836e5-bd94-4a4c-a1ec-ee329f804cb6-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-9mdst\" (UID: \"474836e5-bd94-4a4c-a1ec-ee329f804cb6\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-9mdst"
Apr 21 03:59:23.455629 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:59:23.455489 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lchm4\" (UniqueName: \"kubernetes.io/projected/c8c7e63b-ee59-4ec9-bf86-150c2642daa7-kube-api-access-lchm4\") pod \"router-default-598f8f588b-lr9v8\" (UID: \"c8c7e63b-ee59-4ec9-bf86-150c2642daa7\") " pod="openshift-ingress/router-default-598f8f588b-lr9v8"
Apr 21 03:59:23.455629 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:59:23.455527 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/474836e5-bd94-4a4c-a1ec-ee329f804cb6-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-9mdst\" (UID: \"474836e5-bd94-4a4c-a1ec-ee329f804cb6\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-9mdst"
Apr 21 03:59:23.455771 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:59:23.455663 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c8c7e63b-ee59-4ec9-bf86-150c2642daa7-service-ca-bundle\") pod \"router-default-598f8f588b-lr9v8\" (UID: \"c8c7e63b-ee59-4ec9-bf86-150c2642daa7\") " pod="openshift-ingress/router-default-598f8f588b-lr9v8"
Apr 21 03:59:23.455771 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:59:23.455711 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c8c7e63b-ee59-4ec9-bf86-150c2642daa7-stats-auth\") pod \"router-default-598f8f588b-lr9v8\" (UID: \"c8c7e63b-ee59-4ec9-bf86-150c2642daa7\") " pod="openshift-ingress/router-default-598f8f588b-lr9v8"
Apr 21 03:59:23.556774 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:59:23.556725 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c8c7e63b-ee59-4ec9-bf86-150c2642daa7-metrics-certs\") pod \"router-default-598f8f588b-lr9v8\" (UID: \"c8c7e63b-ee59-4ec9-bf86-150c2642daa7\") " pod="openshift-ingress/router-default-598f8f588b-lr9v8"
Apr 21 03:59:23.556774 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:59:23.556780 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n6ftk\" (UniqueName: \"kubernetes.io/projected/474836e5-bd94-4a4c-a1ec-ee329f804cb6-kube-api-access-n6ftk\") pod \"cluster-monitoring-operator-75587bd455-9mdst\" (UID: \"474836e5-bd94-4a4c-a1ec-ee329f804cb6\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-9mdst"
Apr 21 03:59:23.556892 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:59:23.556802 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c8c7e63b-ee59-4ec9-bf86-150c2642daa7-default-certificate\") pod \"router-default-598f8f588b-lr9v8\" (UID: \"c8c7e63b-ee59-4ec9-bf86-150c2642daa7\") " pod="openshift-ingress/router-default-598f8f588b-lr9v8"
Apr 21 03:59:23.556892 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:59:23.556821 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/474836e5-bd94-4a4c-a1ec-ee329f804cb6-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-9mdst\" (UID: \"474836e5-bd94-4a4c-a1ec-ee329f804cb6\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-9mdst"
Apr 21 03:59:23.556892 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:59:23.556843 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lchm4\" (UniqueName: \"kubernetes.io/projected/c8c7e63b-ee59-4ec9-bf86-150c2642daa7-kube-api-access-lchm4\") pod \"router-default-598f8f588b-lr9v8\" (UID: \"c8c7e63b-ee59-4ec9-bf86-150c2642daa7\") " pod="openshift-ingress/router-default-598f8f588b-lr9v8"
Apr 21 03:59:23.556892 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:59:23.556858 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/474836e5-bd94-4a4c-a1ec-ee329f804cb6-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-9mdst\" (UID: \"474836e5-bd94-4a4c-a1ec-ee329f804cb6\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-9mdst"
Apr 21 03:59:23.556892 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:59:23.556880 2578 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 21 03:59:23.557073 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:59:23.556964 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c8c7e63b-ee59-4ec9-bf86-150c2642daa7-metrics-certs podName:c8c7e63b-ee59-4ec9-bf86-150c2642daa7 nodeName:}" failed. No retries permitted until 2026-04-21 03:59:24.05694302 +0000 UTC m=+107.726382079 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c8c7e63b-ee59-4ec9-bf86-150c2642daa7-metrics-certs") pod "router-default-598f8f588b-lr9v8" (UID: "c8c7e63b-ee59-4ec9-bf86-150c2642daa7") : secret "router-metrics-certs-default" not found
Apr 21 03:59:23.557073 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:59:23.556995 2578 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 21 03:59:23.557073 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:59:23.557006 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c8c7e63b-ee59-4ec9-bf86-150c2642daa7-service-ca-bundle\") pod \"router-default-598f8f588b-lr9v8\" (UID: \"c8c7e63b-ee59-4ec9-bf86-150c2642daa7\") " pod="openshift-ingress/router-default-598f8f588b-lr9v8"
Apr 21 03:59:23.557073 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:59:23.557044 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c8c7e63b-ee59-4ec9-bf86-150c2642daa7-stats-auth\") pod \"router-default-598f8f588b-lr9v8\" (UID: \"c8c7e63b-ee59-4ec9-bf86-150c2642daa7\") " pod="openshift-ingress/router-default-598f8f588b-lr9v8"
Apr 21 03:59:23.557073 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:59:23.557064 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/474836e5-bd94-4a4c-a1ec-ee329f804cb6-cluster-monitoring-operator-tls podName:474836e5-bd94-4a4c-a1ec-ee329f804cb6 nodeName:}" failed. No retries permitted until 2026-04-21 03:59:24.057046308 +0000 UTC m=+107.726485374 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/474836e5-bd94-4a4c-a1ec-ee329f804cb6-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-9mdst" (UID: "474836e5-bd94-4a4c-a1ec-ee329f804cb6") : secret "cluster-monitoring-operator-tls" not found
Apr 21 03:59:23.557319 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:59:23.557171 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c8c7e63b-ee59-4ec9-bf86-150c2642daa7-service-ca-bundle podName:c8c7e63b-ee59-4ec9-bf86-150c2642daa7 nodeName:}" failed. No retries permitted until 2026-04-21 03:59:24.057153216 +0000 UTC m=+107.726592289 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/c8c7e63b-ee59-4ec9-bf86-150c2642daa7-service-ca-bundle") pod "router-default-598f8f588b-lr9v8" (UID: "c8c7e63b-ee59-4ec9-bf86-150c2642daa7") : configmap references non-existent config key: service-ca.crt
Apr 21 03:59:23.557632 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:59:23.557612 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/474836e5-bd94-4a4c-a1ec-ee329f804cb6-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-9mdst\" (UID: \"474836e5-bd94-4a4c-a1ec-ee329f804cb6\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-9mdst"
Apr 21 03:59:23.559959 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:59:23.559928 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c8c7e63b-ee59-4ec9-bf86-150c2642daa7-default-certificate\") pod \"router-default-598f8f588b-lr9v8\" (UID: \"c8c7e63b-ee59-4ec9-bf86-150c2642daa7\") " pod="openshift-ingress/router-default-598f8f588b-lr9v8"
Apr 21 03:59:23.559959 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:59:23.559942 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c8c7e63b-ee59-4ec9-bf86-150c2642daa7-stats-auth\") pod \"router-default-598f8f588b-lr9v8\" (UID: \"c8c7e63b-ee59-4ec9-bf86-150c2642daa7\") " pod="openshift-ingress/router-default-598f8f588b-lr9v8"
Apr 21 03:59:23.567580 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:59:23.567556 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n6ftk\" (UniqueName: \"kubernetes.io/projected/474836e5-bd94-4a4c-a1ec-ee329f804cb6-kube-api-access-n6ftk\") pod \"cluster-monitoring-operator-75587bd455-9mdst\" (UID: \"474836e5-bd94-4a4c-a1ec-ee329f804cb6\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-9mdst"
Apr 21 03:59:23.567665 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:59:23.567600 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lchm4\" (UniqueName: \"kubernetes.io/projected/c8c7e63b-ee59-4ec9-bf86-150c2642daa7-kube-api-access-lchm4\") pod \"router-default-598f8f588b-lr9v8\" (UID: \"c8c7e63b-ee59-4ec9-bf86-150c2642daa7\") " pod="openshift-ingress/router-default-598f8f588b-lr9v8"
Apr 21 03:59:24.061721 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:59:24.061676 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/474836e5-bd94-4a4c-a1ec-ee329f804cb6-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-9mdst\" (UID: \"474836e5-bd94-4a4c-a1ec-ee329f804cb6\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-9mdst"
Apr 21 03:59:24.061903 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:59:24.061760 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c8c7e63b-ee59-4ec9-bf86-150c2642daa7-service-ca-bundle\") pod \"router-default-598f8f588b-lr9v8\" (UID: \"c8c7e63b-ee59-4ec9-bf86-150c2642daa7\") " pod="openshift-ingress/router-default-598f8f588b-lr9v8"
Apr 21 03:59:24.061903 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:59:24.061796 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c8c7e63b-ee59-4ec9-bf86-150c2642daa7-metrics-certs\") pod \"router-default-598f8f588b-lr9v8\" (UID: \"c8c7e63b-ee59-4ec9-bf86-150c2642daa7\") " pod="openshift-ingress/router-default-598f8f588b-lr9v8"
Apr 21 03:59:24.061903 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:59:24.061809 2578 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 21 03:59:24.061903 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:59:24.061878 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/474836e5-bd94-4a4c-a1ec-ee329f804cb6-cluster-monitoring-operator-tls podName:474836e5-bd94-4a4c-a1ec-ee329f804cb6 nodeName:}" failed. No retries permitted until 2026-04-21 03:59:25.061861768 +0000 UTC m=+108.731300821 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/474836e5-bd94-4a4c-a1ec-ee329f804cb6-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-9mdst" (UID: "474836e5-bd94-4a4c-a1ec-ee329f804cb6") : secret "cluster-monitoring-operator-tls" not found
Apr 21 03:59:24.061903 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:59:24.061887 2578 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 21 03:59:24.062067 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:59:24.061927 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c8c7e63b-ee59-4ec9-bf86-150c2642daa7-service-ca-bundle podName:c8c7e63b-ee59-4ec9-bf86-150c2642daa7 nodeName:}" failed. No retries permitted until 2026-04-21 03:59:25.061911105 +0000 UTC m=+108.731350162 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/c8c7e63b-ee59-4ec9-bf86-150c2642daa7-service-ca-bundle") pod "router-default-598f8f588b-lr9v8" (UID: "c8c7e63b-ee59-4ec9-bf86-150c2642daa7") : configmap references non-existent config key: service-ca.crt
Apr 21 03:59:24.062067 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:59:24.061949 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c8c7e63b-ee59-4ec9-bf86-150c2642daa7-metrics-certs podName:c8c7e63b-ee59-4ec9-bf86-150c2642daa7 nodeName:}" failed. No retries permitted until 2026-04-21 03:59:25.061937744 +0000 UTC m=+108.731376802 (durationBeforeRetry 1s).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c8c7e63b-ee59-4ec9-bf86-150c2642daa7-metrics-certs") pod "router-default-598f8f588b-lr9v8" (UID: "c8c7e63b-ee59-4ec9-bf86-150c2642daa7") : secret "router-metrics-certs-default" not found Apr 21 03:59:25.069675 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:59:25.069630 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c8c7e63b-ee59-4ec9-bf86-150c2642daa7-service-ca-bundle\") pod \"router-default-598f8f588b-lr9v8\" (UID: \"c8c7e63b-ee59-4ec9-bf86-150c2642daa7\") " pod="openshift-ingress/router-default-598f8f588b-lr9v8" Apr 21 03:59:25.069675 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:59:25.069696 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c8c7e63b-ee59-4ec9-bf86-150c2642daa7-metrics-certs\") pod \"router-default-598f8f588b-lr9v8\" (UID: \"c8c7e63b-ee59-4ec9-bf86-150c2642daa7\") " pod="openshift-ingress/router-default-598f8f588b-lr9v8" Apr 21 03:59:25.070115 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:59:25.069721 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/474836e5-bd94-4a4c-a1ec-ee329f804cb6-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-9mdst\" (UID: \"474836e5-bd94-4a4c-a1ec-ee329f804cb6\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-9mdst" Apr 21 03:59:25.070115 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:59:25.069811 2578 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 21 03:59:25.070115 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:59:25.069814 2578 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/c8c7e63b-ee59-4ec9-bf86-150c2642daa7-service-ca-bundle podName:c8c7e63b-ee59-4ec9-bf86-150c2642daa7 nodeName:}" failed. No retries permitted until 2026-04-21 03:59:27.069791135 +0000 UTC m=+110.739230204 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/c8c7e63b-ee59-4ec9-bf86-150c2642daa7-service-ca-bundle") pod "router-default-598f8f588b-lr9v8" (UID: "c8c7e63b-ee59-4ec9-bf86-150c2642daa7") : configmap references non-existent config key: service-ca.crt Apr 21 03:59:25.070115 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:59:25.069840 2578 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 21 03:59:25.070115 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:59:25.069850 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/474836e5-bd94-4a4c-a1ec-ee329f804cb6-cluster-monitoring-operator-tls podName:474836e5-bd94-4a4c-a1ec-ee329f804cb6 nodeName:}" failed. No retries permitted until 2026-04-21 03:59:27.069839106 +0000 UTC m=+110.739278159 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/474836e5-bd94-4a4c-a1ec-ee329f804cb6-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-9mdst" (UID: "474836e5-bd94-4a4c-a1ec-ee329f804cb6") : secret "cluster-monitoring-operator-tls" not found Apr 21 03:59:25.070115 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:59:25.069878 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c8c7e63b-ee59-4ec9-bf86-150c2642daa7-metrics-certs podName:c8c7e63b-ee59-4ec9-bf86-150c2642daa7 nodeName:}" failed. No retries permitted until 2026-04-21 03:59:27.069864767 +0000 UTC m=+110.739303828 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c8c7e63b-ee59-4ec9-bf86-150c2642daa7-metrics-certs") pod "router-default-598f8f588b-lr9v8" (UID: "c8c7e63b-ee59-4ec9-bf86-150c2642daa7") : secret "router-metrics-certs-default" not found Apr 21 03:59:27.087409 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:59:27.087364 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c8c7e63b-ee59-4ec9-bf86-150c2642daa7-metrics-certs\") pod \"router-default-598f8f588b-lr9v8\" (UID: \"c8c7e63b-ee59-4ec9-bf86-150c2642daa7\") " pod="openshift-ingress/router-default-598f8f588b-lr9v8" Apr 21 03:59:27.087409 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:59:27.087411 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/474836e5-bd94-4a4c-a1ec-ee329f804cb6-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-9mdst\" (UID: \"474836e5-bd94-4a4c-a1ec-ee329f804cb6\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-9mdst" Apr 21 03:59:27.087832 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:59:27.087502 2578 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 21 03:59:27.087832 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:59:27.087507 2578 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 21 03:59:27.087832 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:59:27.087556 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/474836e5-bd94-4a4c-a1ec-ee329f804cb6-cluster-monitoring-operator-tls podName:474836e5-bd94-4a4c-a1ec-ee329f804cb6 nodeName:}" failed. 
No retries permitted until 2026-04-21 03:59:31.087542914 +0000 UTC m=+114.756981967 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/474836e5-bd94-4a4c-a1ec-ee329f804cb6-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-9mdst" (UID: "474836e5-bd94-4a4c-a1ec-ee329f804cb6") : secret "cluster-monitoring-operator-tls" not found Apr 21 03:59:27.087832 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:59:27.087588 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c8c7e63b-ee59-4ec9-bf86-150c2642daa7-metrics-certs podName:c8c7e63b-ee59-4ec9-bf86-150c2642daa7 nodeName:}" failed. No retries permitted until 2026-04-21 03:59:31.087575105 +0000 UTC m=+114.757014158 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c8c7e63b-ee59-4ec9-bf86-150c2642daa7-metrics-certs") pod "router-default-598f8f588b-lr9v8" (UID: "c8c7e63b-ee59-4ec9-bf86-150c2642daa7") : secret "router-metrics-certs-default" not found Apr 21 03:59:27.087832 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:59:27.087630 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c8c7e63b-ee59-4ec9-bf86-150c2642daa7-service-ca-bundle\") pod \"router-default-598f8f588b-lr9v8\" (UID: \"c8c7e63b-ee59-4ec9-bf86-150c2642daa7\") " pod="openshift-ingress/router-default-598f8f588b-lr9v8" Apr 21 03:59:27.087832 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:59:27.087735 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c8c7e63b-ee59-4ec9-bf86-150c2642daa7-service-ca-bundle podName:c8c7e63b-ee59-4ec9-bf86-150c2642daa7 nodeName:}" failed. No retries permitted until 2026-04-21 03:59:31.087724125 +0000 UTC m=+114.757163178 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/c8c7e63b-ee59-4ec9-bf86-150c2642daa7-service-ca-bundle") pod "router-default-598f8f588b-lr9v8" (UID: "c8c7e63b-ee59-4ec9-bf86-150c2642daa7") : configmap references non-existent config key: service-ca.crt Apr 21 03:59:28.718980 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:59:28.718951 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-jxqfz_f4acd3fa-d747-4053-9af8-c38066e122ab/dns-node-resolver/0.log" Apr 21 03:59:29.519145 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:59:29.519118 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-jpnv7_30292a27-0318-4f49-b26c-f54654ac07db/node-ca/0.log" Apr 21 03:59:31.123978 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:59:31.123939 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c8c7e63b-ee59-4ec9-bf86-150c2642daa7-metrics-certs\") pod \"router-default-598f8f588b-lr9v8\" (UID: \"c8c7e63b-ee59-4ec9-bf86-150c2642daa7\") " pod="openshift-ingress/router-default-598f8f588b-lr9v8" Apr 21 03:59:31.124381 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:59:31.123988 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/474836e5-bd94-4a4c-a1ec-ee329f804cb6-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-9mdst\" (UID: \"474836e5-bd94-4a4c-a1ec-ee329f804cb6\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-9mdst" Apr 21 03:59:31.124381 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:59:31.124052 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c8c7e63b-ee59-4ec9-bf86-150c2642daa7-service-ca-bundle\") pod 
\"router-default-598f8f588b-lr9v8\" (UID: \"c8c7e63b-ee59-4ec9-bf86-150c2642daa7\") " pod="openshift-ingress/router-default-598f8f588b-lr9v8" Apr 21 03:59:31.124381 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:59:31.124093 2578 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 21 03:59:31.124381 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:59:31.124150 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/474836e5-bd94-4a4c-a1ec-ee329f804cb6-cluster-monitoring-operator-tls podName:474836e5-bd94-4a4c-a1ec-ee329f804cb6 nodeName:}" failed. No retries permitted until 2026-04-21 03:59:39.124134382 +0000 UTC m=+122.793573440 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/474836e5-bd94-4a4c-a1ec-ee329f804cb6-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-9mdst" (UID: "474836e5-bd94-4a4c-a1ec-ee329f804cb6") : secret "cluster-monitoring-operator-tls" not found Apr 21 03:59:31.124381 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:59:31.124164 2578 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 21 03:59:31.124381 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:59:31.124174 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c8c7e63b-ee59-4ec9-bf86-150c2642daa7-service-ca-bundle podName:c8c7e63b-ee59-4ec9-bf86-150c2642daa7 nodeName:}" failed. No retries permitted until 2026-04-21 03:59:39.12415943 +0000 UTC m=+122.793598488 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/c8c7e63b-ee59-4ec9-bf86-150c2642daa7-service-ca-bundle") pod "router-default-598f8f588b-lr9v8" (UID: "c8c7e63b-ee59-4ec9-bf86-150c2642daa7") : configmap references non-existent config key: service-ca.crt Apr 21 03:59:31.124381 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:59:31.124225 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c8c7e63b-ee59-4ec9-bf86-150c2642daa7-metrics-certs podName:c8c7e63b-ee59-4ec9-bf86-150c2642daa7 nodeName:}" failed. No retries permitted until 2026-04-21 03:59:39.124213808 +0000 UTC m=+122.793652861 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c8c7e63b-ee59-4ec9-bf86-150c2642daa7-metrics-certs") pod "router-default-598f8f588b-lr9v8" (UID: "c8c7e63b-ee59-4ec9-bf86-150c2642daa7") : secret "router-metrics-certs-default" not found Apr 21 03:59:32.338719 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:59:32.338679 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-pfkbf"] Apr 21 03:59:32.341924 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:59:32.341904 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-pfkbf" Apr 21 03:59:32.344563 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:59:32.344536 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\"" Apr 21 03:59:32.344714 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:59:32.344536 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\"" Apr 21 03:59:32.344714 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:59:32.344600 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\"" Apr 21 03:59:32.344714 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:59:32.344614 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-ztg4s\"" Apr 21 03:59:32.345651 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:59:32.345637 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" Apr 21 03:59:32.351326 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:59:32.351267 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-pfkbf"] Apr 21 03:59:32.434956 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:59:32.434921 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/047ff53e-3808-49b6-ad81-7bd15d251053-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-pfkbf\" (UID: \"047ff53e-3808-49b6-ad81-7bd15d251053\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-pfkbf" Apr 21 03:59:32.435146 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:59:32.434968 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/047ff53e-3808-49b6-ad81-7bd15d251053-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-pfkbf\" (UID: \"047ff53e-3808-49b6-ad81-7bd15d251053\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-pfkbf" Apr 21 03:59:32.435146 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:59:32.435085 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4q7d6\" (UniqueName: \"kubernetes.io/projected/047ff53e-3808-49b6-ad81-7bd15d251053-kube-api-access-4q7d6\") pod \"kube-storage-version-migrator-operator-6769c5d45-pfkbf\" (UID: \"047ff53e-3808-49b6-ad81-7bd15d251053\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-pfkbf" Apr 21 03:59:32.535700 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:59:32.535662 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/047ff53e-3808-49b6-ad81-7bd15d251053-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-pfkbf\" (UID: \"047ff53e-3808-49b6-ad81-7bd15d251053\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-pfkbf" Apr 21 03:59:32.535831 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:59:32.535713 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/047ff53e-3808-49b6-ad81-7bd15d251053-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-pfkbf\" (UID: 
\"047ff53e-3808-49b6-ad81-7bd15d251053\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-pfkbf" Apr 21 03:59:32.535831 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:59:32.535767 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4q7d6\" (UniqueName: \"kubernetes.io/projected/047ff53e-3808-49b6-ad81-7bd15d251053-kube-api-access-4q7d6\") pod \"kube-storage-version-migrator-operator-6769c5d45-pfkbf\" (UID: \"047ff53e-3808-49b6-ad81-7bd15d251053\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-pfkbf" Apr 21 03:59:32.536352 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:59:32.536320 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/047ff53e-3808-49b6-ad81-7bd15d251053-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-pfkbf\" (UID: \"047ff53e-3808-49b6-ad81-7bd15d251053\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-pfkbf" Apr 21 03:59:32.537875 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:59:32.537856 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/047ff53e-3808-49b6-ad81-7bd15d251053-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-pfkbf\" (UID: \"047ff53e-3808-49b6-ad81-7bd15d251053\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-pfkbf" Apr 21 03:59:32.543574 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:59:32.543547 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4q7d6\" (UniqueName: \"kubernetes.io/projected/047ff53e-3808-49b6-ad81-7bd15d251053-kube-api-access-4q7d6\") pod \"kube-storage-version-migrator-operator-6769c5d45-pfkbf\" (UID: 
\"047ff53e-3808-49b6-ad81-7bd15d251053\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-pfkbf" Apr 21 03:59:32.651551 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:59:32.651449 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-pfkbf" Apr 21 03:59:32.766717 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:59:32.766684 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-pfkbf"] Apr 21 03:59:32.769921 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:59:32.769891 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod047ff53e_3808_49b6_ad81_7bd15d251053.slice/crio-780fc6185eab7011da490851d047ae914f33e115f27d1066b10eb72085c1c8d9 WatchSource:0}: Error finding container 780fc6185eab7011da490851d047ae914f33e115f27d1066b10eb72085c1c8d9: Status 404 returned error can't find the container with id 780fc6185eab7011da490851d047ae914f33e115f27d1066b10eb72085c1c8d9 Apr 21 03:59:33.292587 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:59:33.292551 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-pfkbf" event={"ID":"047ff53e-3808-49b6-ad81-7bd15d251053","Type":"ContainerStarted","Data":"780fc6185eab7011da490851d047ae914f33e115f27d1066b10eb72085c1c8d9"} Apr 21 03:59:35.298210 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:59:35.298170 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-pfkbf" 
event={"ID":"047ff53e-3808-49b6-ad81-7bd15d251053","Type":"ContainerStarted","Data":"9cceb391afbd9e1e3d468d358d3d76edac154e43cef76960b881ae2dec7afb4e"} Apr 21 03:59:35.315641 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:59:35.315587 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-pfkbf" podStartSLOduration=1.78667454 podStartE2EDuration="3.31557058s" podCreationTimestamp="2026-04-21 03:59:32 +0000 UTC" firstStartedPulling="2026-04-21 03:59:32.771596702 +0000 UTC m=+116.441035755" lastFinishedPulling="2026-04-21 03:59:34.30049273 +0000 UTC m=+117.969931795" observedRunningTime="2026-04-21 03:59:35.313852116 +0000 UTC m=+118.983291193" watchObservedRunningTime="2026-04-21 03:59:35.31557058 +0000 UTC m=+118.985009655" Apr 21 03:59:35.794970 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:59:35.794923 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-b2x7m"] Apr 21 03:59:35.798371 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:59:35.798346 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-b2x7m" Apr 21 03:59:35.801141 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:59:35.801120 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-v6bsq\"" Apr 21 03:59:35.801242 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:59:35.801155 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" Apr 21 03:59:35.802146 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:59:35.802132 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" Apr 21 03:59:35.808622 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:59:35.808602 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-b2x7m"] Apr 21 03:59:35.961099 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:59:35.961060 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frr5b\" (UniqueName: \"kubernetes.io/projected/5f93c525-8d0e-4df1-99c5-c8091900d3af-kube-api-access-frr5b\") pod \"migrator-74bb7799d9-b2x7m\" (UID: \"5f93c525-8d0e-4df1-99c5-c8091900d3af\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-b2x7m" Apr 21 03:59:36.061887 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:59:36.061784 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-frr5b\" (UniqueName: \"kubernetes.io/projected/5f93c525-8d0e-4df1-99c5-c8091900d3af-kube-api-access-frr5b\") pod \"migrator-74bb7799d9-b2x7m\" (UID: \"5f93c525-8d0e-4df1-99c5-c8091900d3af\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-b2x7m" Apr 21 03:59:36.069333 ip-10-0-134-136 
kubenswrapper[2578]: I0421 03:59:36.069298 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-frr5b\" (UniqueName: \"kubernetes.io/projected/5f93c525-8d0e-4df1-99c5-c8091900d3af-kube-api-access-frr5b\") pod \"migrator-74bb7799d9-b2x7m\" (UID: \"5f93c525-8d0e-4df1-99c5-c8091900d3af\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-b2x7m" Apr 21 03:59:36.107501 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:59:36.107453 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-b2x7m" Apr 21 03:59:36.221940 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:59:36.221908 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-b2x7m"] Apr 21 03:59:36.224967 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:59:36.224938 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5f93c525_8d0e_4df1_99c5_c8091900d3af.slice/crio-268f3b3106d217e49a8b2d8ca110822ad43826787441f32d7706d3202da87d32 WatchSource:0}: Error finding container 268f3b3106d217e49a8b2d8ca110822ad43826787441f32d7706d3202da87d32: Status 404 returned error can't find the container with id 268f3b3106d217e49a8b2d8ca110822ad43826787441f32d7706d3202da87d32 Apr 21 03:59:36.300997 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:59:36.300956 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-b2x7m" event={"ID":"5f93c525-8d0e-4df1-99c5-c8091900d3af","Type":"ContainerStarted","Data":"268f3b3106d217e49a8b2d8ca110822ad43826787441f32d7706d3202da87d32"} Apr 21 03:59:37.306053 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:59:37.306014 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-b2x7m" 
event={"ID":"5f93c525-8d0e-4df1-99c5-c8091900d3af","Type":"ContainerStarted","Data":"f60261fdd34313587a43198fd5dfb3fdce025e8c25fc76f1dae41458d9a206a9"} Apr 21 03:59:37.306053 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:59:37.306058 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-b2x7m" event={"ID":"5f93c525-8d0e-4df1-99c5-c8091900d3af","Type":"ContainerStarted","Data":"4ca0673a29035f858b48ae97d2e9b5cceb3cf0a92bcd1f0f41d5bf718f6f1fbd"} Apr 21 03:59:37.322733 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:59:37.322689 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-b2x7m" podStartSLOduration=1.4541337859999999 podStartE2EDuration="2.322675986s" podCreationTimestamp="2026-04-21 03:59:35 +0000 UTC" firstStartedPulling="2026-04-21 03:59:36.226751497 +0000 UTC m=+119.896190549" lastFinishedPulling="2026-04-21 03:59:37.095293693 +0000 UTC m=+120.764732749" observedRunningTime="2026-04-21 03:59:37.322320291 +0000 UTC m=+120.991759359" watchObservedRunningTime="2026-04-21 03:59:37.322675986 +0000 UTC m=+120.992115065" Apr 21 03:59:39.185951 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:59:39.185909 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c8c7e63b-ee59-4ec9-bf86-150c2642daa7-service-ca-bundle\") pod \"router-default-598f8f588b-lr9v8\" (UID: \"c8c7e63b-ee59-4ec9-bf86-150c2642daa7\") " pod="openshift-ingress/router-default-598f8f588b-lr9v8" Apr 21 03:59:39.186376 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:59:39.185968 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c8c7e63b-ee59-4ec9-bf86-150c2642daa7-metrics-certs\") pod \"router-default-598f8f588b-lr9v8\" (UID: \"c8c7e63b-ee59-4ec9-bf86-150c2642daa7\") " 
pod="openshift-ingress/router-default-598f8f588b-lr9v8" Apr 21 03:59:39.186376 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:59:39.186000 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/474836e5-bd94-4a4c-a1ec-ee329f804cb6-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-9mdst\" (UID: \"474836e5-bd94-4a4c-a1ec-ee329f804cb6\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-9mdst" Apr 21 03:59:39.186376 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:59:39.186076 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c8c7e63b-ee59-4ec9-bf86-150c2642daa7-service-ca-bundle podName:c8c7e63b-ee59-4ec9-bf86-150c2642daa7 nodeName:}" failed. No retries permitted until 2026-04-21 03:59:55.18605843 +0000 UTC m=+138.855497483 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/c8c7e63b-ee59-4ec9-bf86-150c2642daa7-service-ca-bundle") pod "router-default-598f8f588b-lr9v8" (UID: "c8c7e63b-ee59-4ec9-bf86-150c2642daa7") : configmap references non-existent config key: service-ca.crt Apr 21 03:59:39.186376 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:59:39.186123 2578 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 21 03:59:39.186376 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:59:39.186183 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/474836e5-bd94-4a4c-a1ec-ee329f804cb6-cluster-monitoring-operator-tls podName:474836e5-bd94-4a4c-a1ec-ee329f804cb6 nodeName:}" failed. No retries permitted until 2026-04-21 03:59:55.186171395 +0000 UTC m=+138.855610448 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/474836e5-bd94-4a4c-a1ec-ee329f804cb6-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-9mdst" (UID: "474836e5-bd94-4a4c-a1ec-ee329f804cb6") : secret "cluster-monitoring-operator-tls" not found Apr 21 03:59:39.186376 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:59:39.186123 2578 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 21 03:59:39.186376 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:59:39.186211 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c8c7e63b-ee59-4ec9-bf86-150c2642daa7-metrics-certs podName:c8c7e63b-ee59-4ec9-bf86-150c2642daa7 nodeName:}" failed. No retries permitted until 2026-04-21 03:59:55.186204891 +0000 UTC m=+138.855643945 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c8c7e63b-ee59-4ec9-bf86-150c2642daa7-metrics-certs") pod "router-default-598f8f588b-lr9v8" (UID: "c8c7e63b-ee59-4ec9-bf86-150c2642daa7") : secret "router-metrics-certs-default" not found Apr 21 03:59:45.637628 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:59:45.637596 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6e98b17f-4794-44aa-8756-58a9bd9cb37a-metrics-certs\") pod \"network-metrics-daemon-2lrq9\" (UID: \"6e98b17f-4794-44aa-8756-58a9bd9cb37a\") " pod="openshift-multus/network-metrics-daemon-2lrq9" Apr 21 03:59:45.638007 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:59:45.637751 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 21 03:59:45.638007 ip-10-0-134-136 kubenswrapper[2578]: E0421 03:59:45.637827 2578 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/6e98b17f-4794-44aa-8756-58a9bd9cb37a-metrics-certs podName:6e98b17f-4794-44aa-8756-58a9bd9cb37a nodeName:}" failed. No retries permitted until 2026-04-21 04:01:47.637810836 +0000 UTC m=+251.307249889 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6e98b17f-4794-44aa-8756-58a9bd9cb37a-metrics-certs") pod "network-metrics-daemon-2lrq9" (UID: "6e98b17f-4794-44aa-8756-58a9bd9cb37a") : secret "metrics-daemon-secret" not found Apr 21 03:59:55.209086 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:59:55.209031 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c8c7e63b-ee59-4ec9-bf86-150c2642daa7-service-ca-bundle\") pod \"router-default-598f8f588b-lr9v8\" (UID: \"c8c7e63b-ee59-4ec9-bf86-150c2642daa7\") " pod="openshift-ingress/router-default-598f8f588b-lr9v8" Apr 21 03:59:55.209680 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:59:55.209118 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c8c7e63b-ee59-4ec9-bf86-150c2642daa7-metrics-certs\") pod \"router-default-598f8f588b-lr9v8\" (UID: \"c8c7e63b-ee59-4ec9-bf86-150c2642daa7\") " pod="openshift-ingress/router-default-598f8f588b-lr9v8" Apr 21 03:59:55.209680 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:59:55.209152 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/474836e5-bd94-4a4c-a1ec-ee329f804cb6-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-9mdst\" (UID: \"474836e5-bd94-4a4c-a1ec-ee329f804cb6\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-9mdst" Apr 21 03:59:55.209680 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:59:55.209638 2578 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c8c7e63b-ee59-4ec9-bf86-150c2642daa7-service-ca-bundle\") pod \"router-default-598f8f588b-lr9v8\" (UID: \"c8c7e63b-ee59-4ec9-bf86-150c2642daa7\") " pod="openshift-ingress/router-default-598f8f588b-lr9v8" Apr 21 03:59:55.211590 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:59:55.211565 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c8c7e63b-ee59-4ec9-bf86-150c2642daa7-metrics-certs\") pod \"router-default-598f8f588b-lr9v8\" (UID: \"c8c7e63b-ee59-4ec9-bf86-150c2642daa7\") " pod="openshift-ingress/router-default-598f8f588b-lr9v8" Apr 21 03:59:55.211709 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:59:55.211615 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/474836e5-bd94-4a4c-a1ec-ee329f804cb6-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-9mdst\" (UID: \"474836e5-bd94-4a4c-a1ec-ee329f804cb6\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-9mdst" Apr 21 03:59:55.450599 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:59:55.450561 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-vlbxh\"" Apr 21 03:59:55.456293 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:59:55.456259 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-dzn5d\"" Apr 21 03:59:55.457311 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:59:55.457294 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-9mdst" Apr 21 03:59:55.466256 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:59:55.466119 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-598f8f588b-lr9v8" Apr 21 03:59:55.581105 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:59:55.581077 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-9mdst"] Apr 21 03:59:55.583986 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:59:55.583961 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod474836e5_bd94_4a4c_a1ec_ee329f804cb6.slice/crio-3aca7fdd17b9f5163d054710f1c062e6028062b25236f6cf72493f412ca0f463 WatchSource:0}: Error finding container 3aca7fdd17b9f5163d054710f1c062e6028062b25236f6cf72493f412ca0f463: Status 404 returned error can't find the container with id 3aca7fdd17b9f5163d054710f1c062e6028062b25236f6cf72493f412ca0f463 Apr 21 03:59:55.596296 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:59:55.596255 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-598f8f588b-lr9v8"] Apr 21 03:59:55.599468 ip-10-0-134-136 kubenswrapper[2578]: W0421 03:59:55.599441 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc8c7e63b_ee59_4ec9_bf86_150c2642daa7.slice/crio-73a51ba6d54f820b43013ba8bdf1280d01a475bd871465b3de6acb27a26dd46b WatchSource:0}: Error finding container 73a51ba6d54f820b43013ba8bdf1280d01a475bd871465b3de6acb27a26dd46b: Status 404 returned error can't find the container with id 73a51ba6d54f820b43013ba8bdf1280d01a475bd871465b3de6acb27a26dd46b Apr 21 03:59:56.350827 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:59:56.350750 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-598f8f588b-lr9v8" event={"ID":"c8c7e63b-ee59-4ec9-bf86-150c2642daa7","Type":"ContainerStarted","Data":"cddafcfcbe562b141b2603722ddb88ee075d989ad5a7d984af637b904904099e"} Apr 21 03:59:56.350827 ip-10-0-134-136 
kubenswrapper[2578]: I0421 03:59:56.350799 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-598f8f588b-lr9v8" event={"ID":"c8c7e63b-ee59-4ec9-bf86-150c2642daa7","Type":"ContainerStarted","Data":"73a51ba6d54f820b43013ba8bdf1280d01a475bd871465b3de6acb27a26dd46b"} Apr 21 03:59:56.351933 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:59:56.351895 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-9mdst" event={"ID":"474836e5-bd94-4a4c-a1ec-ee329f804cb6","Type":"ContainerStarted","Data":"3aca7fdd17b9f5163d054710f1c062e6028062b25236f6cf72493f412ca0f463"} Apr 21 03:59:56.370110 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:59:56.370054 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-598f8f588b-lr9v8" podStartSLOduration=33.370037321 podStartE2EDuration="33.370037321s" podCreationTimestamp="2026-04-21 03:59:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 03:59:56.369324714 +0000 UTC m=+140.038763790" watchObservedRunningTime="2026-04-21 03:59:56.370037321 +0000 UTC m=+140.039476396" Apr 21 03:59:56.466391 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:59:56.466352 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-598f8f588b-lr9v8" Apr 21 03:59:56.469464 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:59:56.469437 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-598f8f588b-lr9v8" Apr 21 03:59:57.355730 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:59:57.355690 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-9mdst" 
event={"ID":"474836e5-bd94-4a4c-a1ec-ee329f804cb6","Type":"ContainerStarted","Data":"31fdb66a086f4d39c34b5465e8dfe9bf4c7c1c825f57f8779ef00a5e17f2edd4"} Apr 21 03:59:57.356149 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:59:57.355794 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-598f8f588b-lr9v8" Apr 21 03:59:57.357040 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:59:57.357019 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-598f8f588b-lr9v8" Apr 21 03:59:57.385081 ip-10-0-134-136 kubenswrapper[2578]: I0421 03:59:57.385026 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-9mdst" podStartSLOduration=32.81583628 podStartE2EDuration="34.38501137s" podCreationTimestamp="2026-04-21 03:59:23 +0000 UTC" firstStartedPulling="2026-04-21 03:59:55.58567061 +0000 UTC m=+139.255109663" lastFinishedPulling="2026-04-21 03:59:57.154845695 +0000 UTC m=+140.824284753" observedRunningTime="2026-04-21 03:59:57.3841113 +0000 UTC m=+141.053550375" watchObservedRunningTime="2026-04-21 03:59:57.38501137 +0000 UTC m=+141.054450445" Apr 21 04:00:00.233419 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:00.233389 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-qsn72"] Apr 21 04:00:00.236476 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:00.236452 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-qsn72" Apr 21 04:00:00.241151 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:00.241128 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-tgq7l\"" Apr 21 04:00:00.241300 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:00.241127 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 21 04:00:00.241705 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:00.241676 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 21 04:00:00.241805 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:00.241723 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 21 04:00:00.242069 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:00.242052 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 21 04:00:00.245767 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:00.245744 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/f7da8ca3-9b52-4d03-bf6c-2d7bd826ccb1-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-qsn72\" (UID: \"f7da8ca3-9b52-4d03-bf6c-2d7bd826ccb1\") " pod="openshift-insights/insights-runtime-extractor-qsn72" Apr 21 04:00:00.245888 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:00.245781 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4czv\" (UniqueName: \"kubernetes.io/projected/f7da8ca3-9b52-4d03-bf6c-2d7bd826ccb1-kube-api-access-s4czv\") pod \"insights-runtime-extractor-qsn72\" (UID: 
\"f7da8ca3-9b52-4d03-bf6c-2d7bd826ccb1\") " pod="openshift-insights/insights-runtime-extractor-qsn72" Apr 21 04:00:00.245888 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:00.245810 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/f7da8ca3-9b52-4d03-bf6c-2d7bd826ccb1-crio-socket\") pod \"insights-runtime-extractor-qsn72\" (UID: \"f7da8ca3-9b52-4d03-bf6c-2d7bd826ccb1\") " pod="openshift-insights/insights-runtime-extractor-qsn72" Apr 21 04:00:00.245888 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:00.245833 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/f7da8ca3-9b52-4d03-bf6c-2d7bd826ccb1-data-volume\") pod \"insights-runtime-extractor-qsn72\" (UID: \"f7da8ca3-9b52-4d03-bf6c-2d7bd826ccb1\") " pod="openshift-insights/insights-runtime-extractor-qsn72" Apr 21 04:00:00.246015 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:00.245888 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/f7da8ca3-9b52-4d03-bf6c-2d7bd826ccb1-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-qsn72\" (UID: \"f7da8ca3-9b52-4d03-bf6c-2d7bd826ccb1\") " pod="openshift-insights/insights-runtime-extractor-qsn72" Apr 21 04:00:00.249591 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:00.249564 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-qsn72"] Apr 21 04:00:00.323141 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:00.323108 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-qf2jm"] Apr 21 04:00:00.326012 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:00.325992 2578 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-image-registry/image-registry-66d4d5dd58-9ltmm"] Apr 21 04:00:00.326158 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:00.326142 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-qf2jm" Apr 21 04:00:00.328775 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:00.328757 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66d4d5dd58-9ltmm" Apr 21 04:00:00.329320 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:00.329302 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-hmtxj\"" Apr 21 04:00:00.329853 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:00.329834 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 21 04:00:00.329925 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:00.329841 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 21 04:00:00.331014 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:00.330998 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 21 04:00:00.331588 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:00.331571 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-fzhgd\"" Apr 21 04:00:00.331691 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:00.331634 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 21 04:00:00.331968 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:00.331951 2578 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 21 04:00:00.340224 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:00.340202 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 21 04:00:00.343604 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:00.343579 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-qf2jm"] Apr 21 04:00:00.346536 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:00.346499 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66d4d5dd58-9ltmm"] Apr 21 04:00:00.346666 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:00.346584 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2cdd1b89-0dd4-4456-a8ac-6b4888631b60-registry-tls\") pod \"image-registry-66d4d5dd58-9ltmm\" (UID: \"2cdd1b89-0dd4-4456-a8ac-6b4888631b60\") " pod="openshift-image-registry/image-registry-66d4d5dd58-9ltmm" Apr 21 04:00:00.346666 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:00.346618 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/f7da8ca3-9b52-4d03-bf6c-2d7bd826ccb1-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-qsn72\" (UID: \"f7da8ca3-9b52-4d03-bf6c-2d7bd826ccb1\") " pod="openshift-insights/insights-runtime-extractor-qsn72" Apr 21 04:00:00.346666 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:00.346643 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/2cdd1b89-0dd4-4456-a8ac-6b4888631b60-image-registry-private-configuration\") pod \"image-registry-66d4d5dd58-9ltmm\" (UID: \"2cdd1b89-0dd4-4456-a8ac-6b4888631b60\") " 
pod="openshift-image-registry/image-registry-66d4d5dd58-9ltmm" Apr 21 04:00:00.346824 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:00.346668 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2cdd1b89-0dd4-4456-a8ac-6b4888631b60-registry-certificates\") pod \"image-registry-66d4d5dd58-9ltmm\" (UID: \"2cdd1b89-0dd4-4456-a8ac-6b4888631b60\") " pod="openshift-image-registry/image-registry-66d4d5dd58-9ltmm" Apr 21 04:00:00.346824 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:00.346743 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s4czv\" (UniqueName: \"kubernetes.io/projected/f7da8ca3-9b52-4d03-bf6c-2d7bd826ccb1-kube-api-access-s4czv\") pod \"insights-runtime-extractor-qsn72\" (UID: \"f7da8ca3-9b52-4d03-bf6c-2d7bd826ccb1\") " pod="openshift-insights/insights-runtime-extractor-qsn72" Apr 21 04:00:00.346824 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:00.346768 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2cdd1b89-0dd4-4456-a8ac-6b4888631b60-bound-sa-token\") pod \"image-registry-66d4d5dd58-9ltmm\" (UID: \"2cdd1b89-0dd4-4456-a8ac-6b4888631b60\") " pod="openshift-image-registry/image-registry-66d4d5dd58-9ltmm" Apr 21 04:00:00.346824 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:00.346821 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/f7da8ca3-9b52-4d03-bf6c-2d7bd826ccb1-data-volume\") pod \"insights-runtime-extractor-qsn72\" (UID: \"f7da8ca3-9b52-4d03-bf6c-2d7bd826ccb1\") " pod="openshift-insights/insights-runtime-extractor-qsn72" Apr 21 04:00:00.346991 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:00.346854 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2cdd1b89-0dd4-4456-a8ac-6b4888631b60-trusted-ca\") pod \"image-registry-66d4d5dd58-9ltmm\" (UID: \"2cdd1b89-0dd4-4456-a8ac-6b4888631b60\") " pod="openshift-image-registry/image-registry-66d4d5dd58-9ltmm" Apr 21 04:00:00.346991 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:00.346885 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/f7da8ca3-9b52-4d03-bf6c-2d7bd826ccb1-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-qsn72\" (UID: \"f7da8ca3-9b52-4d03-bf6c-2d7bd826ccb1\") " pod="openshift-insights/insights-runtime-extractor-qsn72" Apr 21 04:00:00.346991 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:00.346911 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4j8hg\" (UniqueName: \"kubernetes.io/projected/2cdd1b89-0dd4-4456-a8ac-6b4888631b60-kube-api-access-4j8hg\") pod \"image-registry-66d4d5dd58-9ltmm\" (UID: \"2cdd1b89-0dd4-4456-a8ac-6b4888631b60\") " pod="openshift-image-registry/image-registry-66d4d5dd58-9ltmm" Apr 21 04:00:00.346991 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:00.346941 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/f7da8ca3-9b52-4d03-bf6c-2d7bd826ccb1-crio-socket\") pod \"insights-runtime-extractor-qsn72\" (UID: \"f7da8ca3-9b52-4d03-bf6c-2d7bd826ccb1\") " pod="openshift-insights/insights-runtime-extractor-qsn72" Apr 21 04:00:00.346991 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:00.346967 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvvpc\" (UniqueName: \"kubernetes.io/projected/07645b09-21c7-466b-a46d-b48a72d9c654-kube-api-access-cvvpc\") pod \"downloads-6bcc868b7-qf2jm\" (UID: \"07645b09-21c7-466b-a46d-b48a72d9c654\") " 
pod="openshift-console/downloads-6bcc868b7-qf2jm" Apr 21 04:00:00.347212 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:00.347001 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2cdd1b89-0dd4-4456-a8ac-6b4888631b60-ca-trust-extracted\") pod \"image-registry-66d4d5dd58-9ltmm\" (UID: \"2cdd1b89-0dd4-4456-a8ac-6b4888631b60\") " pod="openshift-image-registry/image-registry-66d4d5dd58-9ltmm" Apr 21 04:00:00.347212 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:00.347030 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2cdd1b89-0dd4-4456-a8ac-6b4888631b60-installation-pull-secrets\") pod \"image-registry-66d4d5dd58-9ltmm\" (UID: \"2cdd1b89-0dd4-4456-a8ac-6b4888631b60\") " pod="openshift-image-registry/image-registry-66d4d5dd58-9ltmm" Apr 21 04:00:00.347212 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:00.347063 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/f7da8ca3-9b52-4d03-bf6c-2d7bd826ccb1-crio-socket\") pod \"insights-runtime-extractor-qsn72\" (UID: \"f7da8ca3-9b52-4d03-bf6c-2d7bd826ccb1\") " pod="openshift-insights/insights-runtime-extractor-qsn72" Apr 21 04:00:00.347373 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:00.347221 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/f7da8ca3-9b52-4d03-bf6c-2d7bd826ccb1-data-volume\") pod \"insights-runtime-extractor-qsn72\" (UID: \"f7da8ca3-9b52-4d03-bf6c-2d7bd826ccb1\") " pod="openshift-insights/insights-runtime-extractor-qsn72" Apr 21 04:00:00.347476 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:00.347458 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: 
\"kubernetes.io/configmap/f7da8ca3-9b52-4d03-bf6c-2d7bd826ccb1-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-qsn72\" (UID: \"f7da8ca3-9b52-4d03-bf6c-2d7bd826ccb1\") " pod="openshift-insights/insights-runtime-extractor-qsn72" Apr 21 04:00:00.349176 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:00.349155 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/f7da8ca3-9b52-4d03-bf6c-2d7bd826ccb1-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-qsn72\" (UID: \"f7da8ca3-9b52-4d03-bf6c-2d7bd826ccb1\") " pod="openshift-insights/insights-runtime-extractor-qsn72" Apr 21 04:00:00.360340 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:00.360308 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-66d4d5dd58-9ltmm"] Apr 21 04:00:00.360536 ip-10-0-134-136 kubenswrapper[2578]: E0421 04:00:00.360514 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[bound-sa-token ca-trust-extracted image-registry-private-configuration installation-pull-secrets kube-api-access-4j8hg registry-certificates registry-tls trusted-ca], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-image-registry/image-registry-66d4d5dd58-9ltmm" podUID="2cdd1b89-0dd4-4456-a8ac-6b4888631b60" Apr 21 04:00:00.362939 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:00.362880 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66d4d5dd58-9ltmm" Apr 21 04:00:00.367544 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:00.367518 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66d4d5dd58-9ltmm" Apr 21 04:00:00.368122 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:00.368102 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4czv\" (UniqueName: \"kubernetes.io/projected/f7da8ca3-9b52-4d03-bf6c-2d7bd826ccb1-kube-api-access-s4czv\") pod \"insights-runtime-extractor-qsn72\" (UID: \"f7da8ca3-9b52-4d03-bf6c-2d7bd826ccb1\") " pod="openshift-insights/insights-runtime-extractor-qsn72" Apr 21 04:00:00.447346 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:00.447308 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2cdd1b89-0dd4-4456-a8ac-6b4888631b60-ca-trust-extracted\") pod \"image-registry-66d4d5dd58-9ltmm\" (UID: \"2cdd1b89-0dd4-4456-a8ac-6b4888631b60\") " pod="openshift-image-registry/image-registry-66d4d5dd58-9ltmm" Apr 21 04:00:00.447529 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:00.447354 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2cdd1b89-0dd4-4456-a8ac-6b4888631b60-installation-pull-secrets\") pod \"image-registry-66d4d5dd58-9ltmm\" (UID: \"2cdd1b89-0dd4-4456-a8ac-6b4888631b60\") " pod="openshift-image-registry/image-registry-66d4d5dd58-9ltmm" Apr 21 04:00:00.447529 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:00.447382 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2cdd1b89-0dd4-4456-a8ac-6b4888631b60-registry-tls\") pod \"image-registry-66d4d5dd58-9ltmm\" (UID: \"2cdd1b89-0dd4-4456-a8ac-6b4888631b60\") " pod="openshift-image-registry/image-registry-66d4d5dd58-9ltmm" Apr 21 04:00:00.447529 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:00.447409 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/2cdd1b89-0dd4-4456-a8ac-6b4888631b60-image-registry-private-configuration\") pod \"image-registry-66d4d5dd58-9ltmm\" (UID: \"2cdd1b89-0dd4-4456-a8ac-6b4888631b60\") " pod="openshift-image-registry/image-registry-66d4d5dd58-9ltmm" Apr 21 04:00:00.447529 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:00.447436 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2cdd1b89-0dd4-4456-a8ac-6b4888631b60-registry-certificates\") pod \"image-registry-66d4d5dd58-9ltmm\" (UID: \"2cdd1b89-0dd4-4456-a8ac-6b4888631b60\") " pod="openshift-image-registry/image-registry-66d4d5dd58-9ltmm" Apr 21 04:00:00.447529 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:00.447495 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2cdd1b89-0dd4-4456-a8ac-6b4888631b60-bound-sa-token\") pod \"image-registry-66d4d5dd58-9ltmm\" (UID: \"2cdd1b89-0dd4-4456-a8ac-6b4888631b60\") " pod="openshift-image-registry/image-registry-66d4d5dd58-9ltmm" Apr 21 04:00:00.447768 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:00.447553 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2cdd1b89-0dd4-4456-a8ac-6b4888631b60-trusted-ca\") pod \"image-registry-66d4d5dd58-9ltmm\" (UID: \"2cdd1b89-0dd4-4456-a8ac-6b4888631b60\") " pod="openshift-image-registry/image-registry-66d4d5dd58-9ltmm" Apr 21 04:00:00.447768 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:00.447580 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4j8hg\" (UniqueName: \"kubernetes.io/projected/2cdd1b89-0dd4-4456-a8ac-6b4888631b60-kube-api-access-4j8hg\") pod \"image-registry-66d4d5dd58-9ltmm\" (UID: \"2cdd1b89-0dd4-4456-a8ac-6b4888631b60\") " 
pod="openshift-image-registry/image-registry-66d4d5dd58-9ltmm" Apr 21 04:00:00.447768 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:00.447692 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2cdd1b89-0dd4-4456-a8ac-6b4888631b60-ca-trust-extracted\") pod \"image-registry-66d4d5dd58-9ltmm\" (UID: \"2cdd1b89-0dd4-4456-a8ac-6b4888631b60\") " pod="openshift-image-registry/image-registry-66d4d5dd58-9ltmm" Apr 21 04:00:00.447921 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:00.447818 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cvvpc\" (UniqueName: \"kubernetes.io/projected/07645b09-21c7-466b-a46d-b48a72d9c654-kube-api-access-cvvpc\") pod \"downloads-6bcc868b7-qf2jm\" (UID: \"07645b09-21c7-466b-a46d-b48a72d9c654\") " pod="openshift-console/downloads-6bcc868b7-qf2jm" Apr 21 04:00:00.448397 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:00.448371 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2cdd1b89-0dd4-4456-a8ac-6b4888631b60-registry-certificates\") pod \"image-registry-66d4d5dd58-9ltmm\" (UID: \"2cdd1b89-0dd4-4456-a8ac-6b4888631b60\") " pod="openshift-image-registry/image-registry-66d4d5dd58-9ltmm" Apr 21 04:00:00.448647 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:00.448629 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2cdd1b89-0dd4-4456-a8ac-6b4888631b60-trusted-ca\") pod \"image-registry-66d4d5dd58-9ltmm\" (UID: \"2cdd1b89-0dd4-4456-a8ac-6b4888631b60\") " pod="openshift-image-registry/image-registry-66d4d5dd58-9ltmm" Apr 21 04:00:00.449834 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:00.449813 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/2cdd1b89-0dd4-4456-a8ac-6b4888631b60-installation-pull-secrets\") pod \"image-registry-66d4d5dd58-9ltmm\" (UID: \"2cdd1b89-0dd4-4456-a8ac-6b4888631b60\") " pod="openshift-image-registry/image-registry-66d4d5dd58-9ltmm" Apr 21 04:00:00.449972 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:00.449954 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/2cdd1b89-0dd4-4456-a8ac-6b4888631b60-image-registry-private-configuration\") pod \"image-registry-66d4d5dd58-9ltmm\" (UID: \"2cdd1b89-0dd4-4456-a8ac-6b4888631b60\") " pod="openshift-image-registry/image-registry-66d4d5dd58-9ltmm" Apr 21 04:00:00.450196 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:00.450180 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2cdd1b89-0dd4-4456-a8ac-6b4888631b60-registry-tls\") pod \"image-registry-66d4d5dd58-9ltmm\" (UID: \"2cdd1b89-0dd4-4456-a8ac-6b4888631b60\") " pod="openshift-image-registry/image-registry-66d4d5dd58-9ltmm" Apr 21 04:00:00.457763 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:00.457729 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2cdd1b89-0dd4-4456-a8ac-6b4888631b60-bound-sa-token\") pod \"image-registry-66d4d5dd58-9ltmm\" (UID: \"2cdd1b89-0dd4-4456-a8ac-6b4888631b60\") " pod="openshift-image-registry/image-registry-66d4d5dd58-9ltmm" Apr 21 04:00:00.457763 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:00.457746 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvvpc\" (UniqueName: \"kubernetes.io/projected/07645b09-21c7-466b-a46d-b48a72d9c654-kube-api-access-cvvpc\") pod \"downloads-6bcc868b7-qf2jm\" (UID: \"07645b09-21c7-466b-a46d-b48a72d9c654\") " pod="openshift-console/downloads-6bcc868b7-qf2jm" Apr 21 04:00:00.458031 ip-10-0-134-136 
kubenswrapper[2578]: I0421 04:00:00.458015 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4j8hg\" (UniqueName: \"kubernetes.io/projected/2cdd1b89-0dd4-4456-a8ac-6b4888631b60-kube-api-access-4j8hg\") pod \"image-registry-66d4d5dd58-9ltmm\" (UID: \"2cdd1b89-0dd4-4456-a8ac-6b4888631b60\") " pod="openshift-image-registry/image-registry-66d4d5dd58-9ltmm" Apr 21 04:00:00.545300 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:00.545247 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-qsn72" Apr 21 04:00:00.548654 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:00.548632 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2cdd1b89-0dd4-4456-a8ac-6b4888631b60-trusted-ca\") pod \"2cdd1b89-0dd4-4456-a8ac-6b4888631b60\" (UID: \"2cdd1b89-0dd4-4456-a8ac-6b4888631b60\") " Apr 21 04:00:00.548764 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:00.548666 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2cdd1b89-0dd4-4456-a8ac-6b4888631b60-bound-sa-token\") pod \"2cdd1b89-0dd4-4456-a8ac-6b4888631b60\" (UID: \"2cdd1b89-0dd4-4456-a8ac-6b4888631b60\") " Apr 21 04:00:00.548764 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:00.548698 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4j8hg\" (UniqueName: \"kubernetes.io/projected/2cdd1b89-0dd4-4456-a8ac-6b4888631b60-kube-api-access-4j8hg\") pod \"2cdd1b89-0dd4-4456-a8ac-6b4888631b60\" (UID: \"2cdd1b89-0dd4-4456-a8ac-6b4888631b60\") " Apr 21 04:00:00.548764 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:00.548718 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/2cdd1b89-0dd4-4456-a8ac-6b4888631b60-registry-tls\") pod \"2cdd1b89-0dd4-4456-a8ac-6b4888631b60\" (UID: \"2cdd1b89-0dd4-4456-a8ac-6b4888631b60\") " Apr 21 04:00:00.548764 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:00.548745 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2cdd1b89-0dd4-4456-a8ac-6b4888631b60-registry-certificates\") pod \"2cdd1b89-0dd4-4456-a8ac-6b4888631b60\" (UID: \"2cdd1b89-0dd4-4456-a8ac-6b4888631b60\") " Apr 21 04:00:00.548963 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:00.548794 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2cdd1b89-0dd4-4456-a8ac-6b4888631b60-installation-pull-secrets\") pod \"2cdd1b89-0dd4-4456-a8ac-6b4888631b60\" (UID: \"2cdd1b89-0dd4-4456-a8ac-6b4888631b60\") " Apr 21 04:00:00.548963 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:00.548822 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/2cdd1b89-0dd4-4456-a8ac-6b4888631b60-image-registry-private-configuration\") pod \"2cdd1b89-0dd4-4456-a8ac-6b4888631b60\" (UID: \"2cdd1b89-0dd4-4456-a8ac-6b4888631b60\") " Apr 21 04:00:00.548963 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:00.548840 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2cdd1b89-0dd4-4456-a8ac-6b4888631b60-ca-trust-extracted\") pod \"2cdd1b89-0dd4-4456-a8ac-6b4888631b60\" (UID: \"2cdd1b89-0dd4-4456-a8ac-6b4888631b60\") " Apr 21 04:00:00.549228 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:00.549168 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2cdd1b89-0dd4-4456-a8ac-6b4888631b60-ca-trust-extracted" 
(OuterVolumeSpecName: "ca-trust-extracted") pod "2cdd1b89-0dd4-4456-a8ac-6b4888631b60" (UID: "2cdd1b89-0dd4-4456-a8ac-6b4888631b60"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 04:00:00.549367 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:00.549262 2578 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2cdd1b89-0dd4-4456-a8ac-6b4888631b60-ca-trust-extracted\") on node \"ip-10-0-134-136.ec2.internal\" DevicePath \"\"" Apr 21 04:00:00.549474 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:00.549449 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2cdd1b89-0dd4-4456-a8ac-6b4888631b60-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "2cdd1b89-0dd4-4456-a8ac-6b4888631b60" (UID: "2cdd1b89-0dd4-4456-a8ac-6b4888631b60"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 04:00:00.551372 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:00.551341 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2cdd1b89-0dd4-4456-a8ac-6b4888631b60-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "2cdd1b89-0dd4-4456-a8ac-6b4888631b60" (UID: "2cdd1b89-0dd4-4456-a8ac-6b4888631b60"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 04:00:00.551492 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:00.551439 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cdd1b89-0dd4-4456-a8ac-6b4888631b60-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "2cdd1b89-0dd4-4456-a8ac-6b4888631b60" (UID: "2cdd1b89-0dd4-4456-a8ac-6b4888631b60"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 04:00:00.551492 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:00.551446 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2cdd1b89-0dd4-4456-a8ac-6b4888631b60-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "2cdd1b89-0dd4-4456-a8ac-6b4888631b60" (UID: "2cdd1b89-0dd4-4456-a8ac-6b4888631b60"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 04:00:00.551492 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:00.551468 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cdd1b89-0dd4-4456-a8ac-6b4888631b60-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "2cdd1b89-0dd4-4456-a8ac-6b4888631b60" (UID: "2cdd1b89-0dd4-4456-a8ac-6b4888631b60"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 04:00:00.551492 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:00.551485 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2cdd1b89-0dd4-4456-a8ac-6b4888631b60-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "2cdd1b89-0dd4-4456-a8ac-6b4888631b60" (UID: "2cdd1b89-0dd4-4456-a8ac-6b4888631b60"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 04:00:00.551492 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:00.551472 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2cdd1b89-0dd4-4456-a8ac-6b4888631b60-kube-api-access-4j8hg" (OuterVolumeSpecName: "kube-api-access-4j8hg") pod "2cdd1b89-0dd4-4456-a8ac-6b4888631b60" (UID: "2cdd1b89-0dd4-4456-a8ac-6b4888631b60"). InnerVolumeSpecName "kube-api-access-4j8hg". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 04:00:00.636721 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:00.636224 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-qf2jm" Apr 21 04:00:00.649899 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:00.649869 2578 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2cdd1b89-0dd4-4456-a8ac-6b4888631b60-installation-pull-secrets\") on node \"ip-10-0-134-136.ec2.internal\" DevicePath \"\"" Apr 21 04:00:00.649899 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:00.649895 2578 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/2cdd1b89-0dd4-4456-a8ac-6b4888631b60-image-registry-private-configuration\") on node \"ip-10-0-134-136.ec2.internal\" DevicePath \"\"" Apr 21 04:00:00.650051 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:00.649907 2578 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2cdd1b89-0dd4-4456-a8ac-6b4888631b60-trusted-ca\") on node \"ip-10-0-134-136.ec2.internal\" DevicePath \"\"" Apr 21 04:00:00.650051 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:00.649916 2578 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2cdd1b89-0dd4-4456-a8ac-6b4888631b60-bound-sa-token\") on node \"ip-10-0-134-136.ec2.internal\" DevicePath \"\"" Apr 21 04:00:00.650051 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:00.649926 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4j8hg\" (UniqueName: \"kubernetes.io/projected/2cdd1b89-0dd4-4456-a8ac-6b4888631b60-kube-api-access-4j8hg\") on node \"ip-10-0-134-136.ec2.internal\" DevicePath \"\"" Apr 21 04:00:00.650051 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:00.649934 2578 
reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2cdd1b89-0dd4-4456-a8ac-6b4888631b60-registry-tls\") on node \"ip-10-0-134-136.ec2.internal\" DevicePath \"\"" Apr 21 04:00:00.650051 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:00.649943 2578 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2cdd1b89-0dd4-4456-a8ac-6b4888631b60-registry-certificates\") on node \"ip-10-0-134-136.ec2.internal\" DevicePath \"\"" Apr 21 04:00:00.676361 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:00.676302 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-qsn72"] Apr 21 04:00:00.679998 ip-10-0-134-136 kubenswrapper[2578]: W0421 04:00:00.679954 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf7da8ca3_9b52_4d03_bf6c_2d7bd826ccb1.slice/crio-4c349572734310e8059094425446d9a1f3609e7fc984421f1a76b6ba6c200c7e WatchSource:0}: Error finding container 4c349572734310e8059094425446d9a1f3609e7fc984421f1a76b6ba6c200c7e: Status 404 returned error can't find the container with id 4c349572734310e8059094425446d9a1f3609e7fc984421f1a76b6ba6c200c7e Apr 21 04:00:00.765167 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:00.765133 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-qf2jm"] Apr 21 04:00:00.768810 ip-10-0-134-136 kubenswrapper[2578]: W0421 04:00:00.768781 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod07645b09_21c7_466b_a46d_b48a72d9c654.slice/crio-13ae174323cde8a251ce07cfc3f13fb5c52d5c83c5d3c6ee3412adca2af14bc4 WatchSource:0}: Error finding container 13ae174323cde8a251ce07cfc3f13fb5c52d5c83c5d3c6ee3412adca2af14bc4: Status 404 returned error can't find the container with id 
13ae174323cde8a251ce07cfc3f13fb5c52d5c83c5d3c6ee3412adca2af14bc4 Apr 21 04:00:01.366720 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:01.366675 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-qf2jm" event={"ID":"07645b09-21c7-466b-a46d-b48a72d9c654","Type":"ContainerStarted","Data":"13ae174323cde8a251ce07cfc3f13fb5c52d5c83c5d3c6ee3412adca2af14bc4"} Apr 21 04:00:01.368215 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:01.368191 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66d4d5dd58-9ltmm" Apr 21 04:00:01.368362 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:01.368177 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-qsn72" event={"ID":"f7da8ca3-9b52-4d03-bf6c-2d7bd826ccb1","Type":"ContainerStarted","Data":"e512e041456ed05cd7f6a2149daf2c024c8bac600249dcda8caf9ff99b32871b"} Apr 21 04:00:01.368424 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:01.368373 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-qsn72" event={"ID":"f7da8ca3-9b52-4d03-bf6c-2d7bd826ccb1","Type":"ContainerStarted","Data":"4c349572734310e8059094425446d9a1f3609e7fc984421f1a76b6ba6c200c7e"} Apr 21 04:00:01.397439 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:01.397413 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-66d4d5dd58-9ltmm"] Apr 21 04:00:01.400431 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:01.400399 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-66d4d5dd58-9ltmm"] Apr 21 04:00:02.373337 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:02.373293 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-qsn72" 
event={"ID":"f7da8ca3-9b52-4d03-bf6c-2d7bd826ccb1","Type":"ContainerStarted","Data":"83a7f6345f281169b246c45aef98d20a04f31680fedb9e6e07d863cadda8b4c5"} Apr 21 04:00:02.928242 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:02.928204 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2cdd1b89-0dd4-4456-a8ac-6b4888631b60" path="/var/lib/kubelet/pods/2cdd1b89-0dd4-4456-a8ac-6b4888631b60/volumes" Apr 21 04:00:03.378512 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:03.378474 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-qsn72" event={"ID":"f7da8ca3-9b52-4d03-bf6c-2d7bd826ccb1","Type":"ContainerStarted","Data":"430fb462de97b067e7ef9cd964ff78f0a1ef6391ff8c00aec92167ccad8b41ef"} Apr 21 04:00:03.397348 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:03.397260 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-qsn72" podStartSLOduration=0.965243509 podStartE2EDuration="3.397239519s" podCreationTimestamp="2026-04-21 04:00:00 +0000 UTC" firstStartedPulling="2026-04-21 04:00:00.744584543 +0000 UTC m=+144.414023596" lastFinishedPulling="2026-04-21 04:00:03.17658054 +0000 UTC m=+146.846019606" observedRunningTime="2026-04-21 04:00:03.395436335 +0000 UTC m=+147.064875410" watchObservedRunningTime="2026-04-21 04:00:03.397239519 +0000 UTC m=+147.066678596" Apr 21 04:00:03.680890 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:03.680855 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-jrv6d"] Apr 21 04:00:03.684294 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:03.684256 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-jrv6d" Apr 21 04:00:03.686930 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:03.686799 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\"" Apr 21 04:00:03.686930 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:03.686808 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 21 04:00:03.686930 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:03.686799 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-ng4zp\"" Apr 21 04:00:03.686930 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:03.686827 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\"" Apr 21 04:00:03.690728 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:03.690707 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-jrv6d"] Apr 21 04:00:03.774256 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:03.774217 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/5cbfce1e-fdfb-4e8b-b9d6-626294c831f0-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-jrv6d\" (UID: \"5cbfce1e-fdfb-4e8b-b9d6-626294c831f0\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-jrv6d" Apr 21 04:00:03.774454 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:03.774268 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5cbfce1e-fdfb-4e8b-b9d6-626294c831f0-prometheus-operator-kube-rbac-proxy-config\") pod 
\"prometheus-operator-5676c8c784-jrv6d\" (UID: \"5cbfce1e-fdfb-4e8b-b9d6-626294c831f0\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-jrv6d" Apr 21 04:00:03.774454 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:03.774355 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5cbfce1e-fdfb-4e8b-b9d6-626294c831f0-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-jrv6d\" (UID: \"5cbfce1e-fdfb-4e8b-b9d6-626294c831f0\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-jrv6d" Apr 21 04:00:03.774454 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:03.774397 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m49th\" (UniqueName: \"kubernetes.io/projected/5cbfce1e-fdfb-4e8b-b9d6-626294c831f0-kube-api-access-m49th\") pod \"prometheus-operator-5676c8c784-jrv6d\" (UID: \"5cbfce1e-fdfb-4e8b-b9d6-626294c831f0\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-jrv6d" Apr 21 04:00:03.875375 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:03.875339 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/5cbfce1e-fdfb-4e8b-b9d6-626294c831f0-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-jrv6d\" (UID: \"5cbfce1e-fdfb-4e8b-b9d6-626294c831f0\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-jrv6d" Apr 21 04:00:03.875375 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:03.875378 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5cbfce1e-fdfb-4e8b-b9d6-626294c831f0-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-jrv6d\" (UID: \"5cbfce1e-fdfb-4e8b-b9d6-626294c831f0\") " 
pod="openshift-monitoring/prometheus-operator-5676c8c784-jrv6d" Apr 21 04:00:03.875602 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:03.875408 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5cbfce1e-fdfb-4e8b-b9d6-626294c831f0-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-jrv6d\" (UID: \"5cbfce1e-fdfb-4e8b-b9d6-626294c831f0\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-jrv6d" Apr 21 04:00:03.875602 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:03.875433 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m49th\" (UniqueName: \"kubernetes.io/projected/5cbfce1e-fdfb-4e8b-b9d6-626294c831f0-kube-api-access-m49th\") pod \"prometheus-operator-5676c8c784-jrv6d\" (UID: \"5cbfce1e-fdfb-4e8b-b9d6-626294c831f0\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-jrv6d" Apr 21 04:00:03.875602 ip-10-0-134-136 kubenswrapper[2578]: E0421 04:00:03.875520 2578 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-tls: secret "prometheus-operator-tls" not found Apr 21 04:00:03.875602 ip-10-0-134-136 kubenswrapper[2578]: E0421 04:00:03.875594 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5cbfce1e-fdfb-4e8b-b9d6-626294c831f0-prometheus-operator-tls podName:5cbfce1e-fdfb-4e8b-b9d6-626294c831f0 nodeName:}" failed. No retries permitted until 2026-04-21 04:00:04.375572698 +0000 UTC m=+148.045011756 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "prometheus-operator-tls" (UniqueName: "kubernetes.io/secret/5cbfce1e-fdfb-4e8b-b9d6-626294c831f0-prometheus-operator-tls") pod "prometheus-operator-5676c8c784-jrv6d" (UID: "5cbfce1e-fdfb-4e8b-b9d6-626294c831f0") : secret "prometheus-operator-tls" not found Apr 21 04:00:03.876189 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:03.876162 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5cbfce1e-fdfb-4e8b-b9d6-626294c831f0-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-jrv6d\" (UID: \"5cbfce1e-fdfb-4e8b-b9d6-626294c831f0\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-jrv6d" Apr 21 04:00:03.877994 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:03.877969 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5cbfce1e-fdfb-4e8b-b9d6-626294c831f0-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-jrv6d\" (UID: \"5cbfce1e-fdfb-4e8b-b9d6-626294c831f0\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-jrv6d" Apr 21 04:00:03.885755 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:03.885725 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m49th\" (UniqueName: \"kubernetes.io/projected/5cbfce1e-fdfb-4e8b-b9d6-626294c831f0-kube-api-access-m49th\") pod \"prometheus-operator-5676c8c784-jrv6d\" (UID: \"5cbfce1e-fdfb-4e8b-b9d6-626294c831f0\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-jrv6d" Apr 21 04:00:04.379649 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:04.379591 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/5cbfce1e-fdfb-4e8b-b9d6-626294c831f0-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-jrv6d\" (UID: 
\"5cbfce1e-fdfb-4e8b-b9d6-626294c831f0\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-jrv6d" Apr 21 04:00:04.382324 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:04.382295 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/5cbfce1e-fdfb-4e8b-b9d6-626294c831f0-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-jrv6d\" (UID: \"5cbfce1e-fdfb-4e8b-b9d6-626294c831f0\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-jrv6d" Apr 21 04:00:04.596663 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:04.596617 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-jrv6d" Apr 21 04:00:04.728980 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:04.728946 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-jrv6d"] Apr 21 04:00:04.732299 ip-10-0-134-136 kubenswrapper[2578]: W0421 04:00:04.732249 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5cbfce1e_fdfb_4e8b_b9d6_626294c831f0.slice/crio-de4099241176bb3d7e19b4edff068fa2eaa96dce9339e5f5aa52a2bde8be7331 WatchSource:0}: Error finding container de4099241176bb3d7e19b4edff068fa2eaa96dce9339e5f5aa52a2bde8be7331: Status 404 returned error can't find the container with id de4099241176bb3d7e19b4edff068fa2eaa96dce9339e5f5aa52a2bde8be7331 Apr 21 04:00:05.385873 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:05.385826 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-jrv6d" event={"ID":"5cbfce1e-fdfb-4e8b-b9d6-626294c831f0","Type":"ContainerStarted","Data":"de4099241176bb3d7e19b4edff068fa2eaa96dce9339e5f5aa52a2bde8be7331"} Apr 21 04:00:06.390098 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:06.390056 2578 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-jrv6d" event={"ID":"5cbfce1e-fdfb-4e8b-b9d6-626294c831f0","Type":"ContainerStarted","Data":"f8e02a8741dd5f2fc41ca7fa5003cd0776b35e48e9f56b9eafc28e76e2630a92"} Apr 21 04:00:06.390098 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:06.390100 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-jrv6d" event={"ID":"5cbfce1e-fdfb-4e8b-b9d6-626294c831f0","Type":"ContainerStarted","Data":"b08045f75ad1f2243f38930f705cc96a3ac45b258ca0ed2d6473497b0325290e"} Apr 21 04:00:06.407483 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:06.407434 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-jrv6d" podStartSLOduration=2.299041355 podStartE2EDuration="3.407418217s" podCreationTimestamp="2026-04-21 04:00:03 +0000 UTC" firstStartedPulling="2026-04-21 04:00:04.734357411 +0000 UTC m=+148.403796465" lastFinishedPulling="2026-04-21 04:00:05.842734263 +0000 UTC m=+149.512173327" observedRunningTime="2026-04-21 04:00:06.405615136 +0000 UTC m=+150.075054207" watchObservedRunningTime="2026-04-21 04:00:06.407418217 +0000 UTC m=+150.076857282" Apr 21 04:00:08.057071 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:08.057039 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-zmnts"] Apr 21 04:00:08.060487 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:08.060457 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-zmnts"
Apr 21 04:00:08.062701 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:08.062671 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 21 04:00:08.062867 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:08.062845 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-d559f\""
Apr 21 04:00:08.062921 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:08.062853 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 21 04:00:08.063000 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:08.062980 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 21 04:00:08.115627 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:08.115591 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/aa4733fb-75e2-4c77-bd5d-7ea904966689-node-exporter-textfile\") pod \"node-exporter-zmnts\" (UID: \"aa4733fb-75e2-4c77-bd5d-7ea904966689\") " pod="openshift-monitoring/node-exporter-zmnts"
Apr 21 04:00:08.115796 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:08.115644 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/aa4733fb-75e2-4c77-bd5d-7ea904966689-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-zmnts\" (UID: \"aa4733fb-75e2-4c77-bd5d-7ea904966689\") " pod="openshift-monitoring/node-exporter-zmnts"
Apr 21 04:00:08.115857 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:08.115790 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/aa4733fb-75e2-4c77-bd5d-7ea904966689-root\") pod \"node-exporter-zmnts\" (UID: \"aa4733fb-75e2-4c77-bd5d-7ea904966689\") " pod="openshift-monitoring/node-exporter-zmnts"
Apr 21 04:00:08.115857 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:08.115831 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/aa4733fb-75e2-4c77-bd5d-7ea904966689-node-exporter-accelerators-collector-config\") pod \"node-exporter-zmnts\" (UID: \"aa4733fb-75e2-4c77-bd5d-7ea904966689\") " pod="openshift-monitoring/node-exporter-zmnts"
Apr 21 04:00:08.115967 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:08.115900 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/aa4733fb-75e2-4c77-bd5d-7ea904966689-node-exporter-tls\") pod \"node-exporter-zmnts\" (UID: \"aa4733fb-75e2-4c77-bd5d-7ea904966689\") " pod="openshift-monitoring/node-exporter-zmnts"
Apr 21 04:00:08.115967 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:08.115939 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqpmn\" (UniqueName: \"kubernetes.io/projected/aa4733fb-75e2-4c77-bd5d-7ea904966689-kube-api-access-mqpmn\") pod \"node-exporter-zmnts\" (UID: \"aa4733fb-75e2-4c77-bd5d-7ea904966689\") " pod="openshift-monitoring/node-exporter-zmnts"
Apr 21 04:00:08.116066 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:08.115974 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/aa4733fb-75e2-4c77-bd5d-7ea904966689-sys\") pod \"node-exporter-zmnts\" (UID: \"aa4733fb-75e2-4c77-bd5d-7ea904966689\") " pod="openshift-monitoring/node-exporter-zmnts"
Apr 21 04:00:08.116066 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:08.116000 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/aa4733fb-75e2-4c77-bd5d-7ea904966689-node-exporter-wtmp\") pod \"node-exporter-zmnts\" (UID: \"aa4733fb-75e2-4c77-bd5d-7ea904966689\") " pod="openshift-monitoring/node-exporter-zmnts"
Apr 21 04:00:08.116066 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:08.116023 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/aa4733fb-75e2-4c77-bd5d-7ea904966689-metrics-client-ca\") pod \"node-exporter-zmnts\" (UID: \"aa4733fb-75e2-4c77-bd5d-7ea904966689\") " pod="openshift-monitoring/node-exporter-zmnts"
Apr 21 04:00:08.217456 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:08.217412 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/aa4733fb-75e2-4c77-bd5d-7ea904966689-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-zmnts\" (UID: \"aa4733fb-75e2-4c77-bd5d-7ea904966689\") " pod="openshift-monitoring/node-exporter-zmnts"
Apr 21 04:00:08.217647 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:08.217595 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/aa4733fb-75e2-4c77-bd5d-7ea904966689-root\") pod \"node-exporter-zmnts\" (UID: \"aa4733fb-75e2-4c77-bd5d-7ea904966689\") " pod="openshift-monitoring/node-exporter-zmnts"
Apr 21 04:00:08.217647 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:08.217614 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/aa4733fb-75e2-4c77-bd5d-7ea904966689-node-exporter-accelerators-collector-config\") pod \"node-exporter-zmnts\" (UID: \"aa4733fb-75e2-4c77-bd5d-7ea904966689\") " pod="openshift-monitoring/node-exporter-zmnts"
Apr 21 04:00:08.217647 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:08.217646 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/aa4733fb-75e2-4c77-bd5d-7ea904966689-node-exporter-tls\") pod \"node-exporter-zmnts\" (UID: \"aa4733fb-75e2-4c77-bd5d-7ea904966689\") " pod="openshift-monitoring/node-exporter-zmnts"
Apr 21 04:00:08.217801 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:08.217671 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mqpmn\" (UniqueName: \"kubernetes.io/projected/aa4733fb-75e2-4c77-bd5d-7ea904966689-kube-api-access-mqpmn\") pod \"node-exporter-zmnts\" (UID: \"aa4733fb-75e2-4c77-bd5d-7ea904966689\") " pod="openshift-monitoring/node-exporter-zmnts"
Apr 21 04:00:08.217801 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:08.217698 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/aa4733fb-75e2-4c77-bd5d-7ea904966689-sys\") pod \"node-exporter-zmnts\" (UID: \"aa4733fb-75e2-4c77-bd5d-7ea904966689\") " pod="openshift-monitoring/node-exporter-zmnts"
Apr 21 04:00:08.217801 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:08.217715 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/aa4733fb-75e2-4c77-bd5d-7ea904966689-root\") pod \"node-exporter-zmnts\" (UID: \"aa4733fb-75e2-4c77-bd5d-7ea904966689\") " pod="openshift-monitoring/node-exporter-zmnts"
Apr 21 04:00:08.217801 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:08.217723 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/aa4733fb-75e2-4c77-bd5d-7ea904966689-node-exporter-wtmp\") pod \"node-exporter-zmnts\" (UID: \"aa4733fb-75e2-4c77-bd5d-7ea904966689\") " pod="openshift-monitoring/node-exporter-zmnts"
Apr 21 04:00:08.217801 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:08.217770 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/aa4733fb-75e2-4c77-bd5d-7ea904966689-metrics-client-ca\") pod \"node-exporter-zmnts\" (UID: \"aa4733fb-75e2-4c77-bd5d-7ea904966689\") " pod="openshift-monitoring/node-exporter-zmnts"
Apr 21 04:00:08.217801 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:08.217778 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/aa4733fb-75e2-4c77-bd5d-7ea904966689-sys\") pod \"node-exporter-zmnts\" (UID: \"aa4733fb-75e2-4c77-bd5d-7ea904966689\") " pod="openshift-monitoring/node-exporter-zmnts"
Apr 21 04:00:08.218060 ip-10-0-134-136 kubenswrapper[2578]: E0421 04:00:08.217799 2578 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found
Apr 21 04:00:08.218060 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:08.217822 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/aa4733fb-75e2-4c77-bd5d-7ea904966689-node-exporter-textfile\") pod \"node-exporter-zmnts\" (UID: \"aa4733fb-75e2-4c77-bd5d-7ea904966689\") " pod="openshift-monitoring/node-exporter-zmnts"
Apr 21 04:00:08.218060 ip-10-0-134-136 kubenswrapper[2578]: E0421 04:00:08.217885 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aa4733fb-75e2-4c77-bd5d-7ea904966689-node-exporter-tls podName:aa4733fb-75e2-4c77-bd5d-7ea904966689 nodeName:}" failed. No retries permitted until 2026-04-21 04:00:08.717863065 +0000 UTC m=+152.387302130 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/aa4733fb-75e2-4c77-bd5d-7ea904966689-node-exporter-tls") pod "node-exporter-zmnts" (UID: "aa4733fb-75e2-4c77-bd5d-7ea904966689") : secret "node-exporter-tls" not found
Apr 21 04:00:08.218175 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:08.218066 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/aa4733fb-75e2-4c77-bd5d-7ea904966689-node-exporter-wtmp\") pod \"node-exporter-zmnts\" (UID: \"aa4733fb-75e2-4c77-bd5d-7ea904966689\") " pod="openshift-monitoring/node-exporter-zmnts"
Apr 21 04:00:08.218175 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:08.218109 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/aa4733fb-75e2-4c77-bd5d-7ea904966689-node-exporter-textfile\") pod \"node-exporter-zmnts\" (UID: \"aa4733fb-75e2-4c77-bd5d-7ea904966689\") " pod="openshift-monitoring/node-exporter-zmnts"
Apr 21 04:00:08.218400 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:08.218376 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/aa4733fb-75e2-4c77-bd5d-7ea904966689-node-exporter-accelerators-collector-config\") pod \"node-exporter-zmnts\" (UID: \"aa4733fb-75e2-4c77-bd5d-7ea904966689\") " pod="openshift-monitoring/node-exporter-zmnts"
Apr 21 04:00:08.218463 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:08.218385 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/aa4733fb-75e2-4c77-bd5d-7ea904966689-metrics-client-ca\") pod \"node-exporter-zmnts\" (UID: \"aa4733fb-75e2-4c77-bd5d-7ea904966689\") " pod="openshift-monitoring/node-exporter-zmnts"
Apr 21 04:00:08.220308 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:08.220268 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/aa4733fb-75e2-4c77-bd5d-7ea904966689-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-zmnts\" (UID: \"aa4733fb-75e2-4c77-bd5d-7ea904966689\") " pod="openshift-monitoring/node-exporter-zmnts"
Apr 21 04:00:08.228976 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:08.228946 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqpmn\" (UniqueName: \"kubernetes.io/projected/aa4733fb-75e2-4c77-bd5d-7ea904966689-kube-api-access-mqpmn\") pod \"node-exporter-zmnts\" (UID: \"aa4733fb-75e2-4c77-bd5d-7ea904966689\") " pod="openshift-monitoring/node-exporter-zmnts"
Apr 21 04:00:08.722030 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:08.721977 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/aa4733fb-75e2-4c77-bd5d-7ea904966689-node-exporter-tls\") pod \"node-exporter-zmnts\" (UID: \"aa4733fb-75e2-4c77-bd5d-7ea904966689\") " pod="openshift-monitoring/node-exporter-zmnts"
Apr 21 04:00:08.722229 ip-10-0-134-136 kubenswrapper[2578]: E0421 04:00:08.722142 2578 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found
Apr 21 04:00:08.722320 ip-10-0-134-136 kubenswrapper[2578]: E0421 04:00:08.722232 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aa4733fb-75e2-4c77-bd5d-7ea904966689-node-exporter-tls podName:aa4733fb-75e2-4c77-bd5d-7ea904966689 nodeName:}" failed. No retries permitted until 2026-04-21 04:00:09.722213129 +0000 UTC m=+153.391652182 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/aa4733fb-75e2-4c77-bd5d-7ea904966689-node-exporter-tls") pod "node-exporter-zmnts" (UID: "aa4733fb-75e2-4c77-bd5d-7ea904966689") : secret "node-exporter-tls" not found
Apr 21 04:00:09.143641 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:09.143602 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 21 04:00:09.163239 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:09.163207 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 21 04:00:09.163464 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:09.163435 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 21 04:00:09.166500 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:09.166112 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\""
Apr 21 04:00:09.166500 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:09.166174 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\""
Apr 21 04:00:09.166500 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:09.166175 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\""
Apr 21 04:00:09.166500 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:09.166305 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\""
Apr 21 04:00:09.166799 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:09.166576 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\""
Apr 21 04:00:09.166799 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:09.166691 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\""
Apr 21 04:00:09.166799 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:09.166704 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-9cdtv\""
Apr 21 04:00:09.166799 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:09.166689 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\""
Apr 21 04:00:09.166799 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:09.166774 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\""
Apr 21 04:00:09.167032 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:09.166843 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\""
Apr 21 04:00:09.227503 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:09.227272 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/d40cad4f-6109-4381-b7f5-2a037fc43ff8-config-volume\") pod \"alertmanager-main-0\" (UID: \"d40cad4f-6109-4381-b7f5-2a037fc43ff8\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 04:00:09.227503 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:09.227341 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/d40cad4f-6109-4381-b7f5-2a037fc43ff8-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"d40cad4f-6109-4381-b7f5-2a037fc43ff8\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 04:00:09.227503 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:09.227373 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d40cad4f-6109-4381-b7f5-2a037fc43ff8-web-config\") pod \"alertmanager-main-0\" (UID: \"d40cad4f-6109-4381-b7f5-2a037fc43ff8\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 04:00:09.227503 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:09.227411 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d40cad4f-6109-4381-b7f5-2a037fc43ff8-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"d40cad4f-6109-4381-b7f5-2a037fc43ff8\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 04:00:09.227503 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:09.227451 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d40cad4f-6109-4381-b7f5-2a037fc43ff8-tls-assets\") pod \"alertmanager-main-0\" (UID: \"d40cad4f-6109-4381-b7f5-2a037fc43ff8\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 04:00:09.227503 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:09.227478 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d40cad4f-6109-4381-b7f5-2a037fc43ff8-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"d40cad4f-6109-4381-b7f5-2a037fc43ff8\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 04:00:09.227503 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:09.227505 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d40cad4f-6109-4381-b7f5-2a037fc43ff8-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"d40cad4f-6109-4381-b7f5-2a037fc43ff8\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 04:00:09.227973 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:09.227570 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/d40cad4f-6109-4381-b7f5-2a037fc43ff8-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"d40cad4f-6109-4381-b7f5-2a037fc43ff8\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 04:00:09.227973 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:09.227591 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d40cad4f-6109-4381-b7f5-2a037fc43ff8-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"d40cad4f-6109-4381-b7f5-2a037fc43ff8\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 04:00:09.227973 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:09.227685 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/d40cad4f-6109-4381-b7f5-2a037fc43ff8-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"d40cad4f-6109-4381-b7f5-2a037fc43ff8\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 04:00:09.227973 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:09.227720 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ntxnm\" (UniqueName: \"kubernetes.io/projected/d40cad4f-6109-4381-b7f5-2a037fc43ff8-kube-api-access-ntxnm\") pod \"alertmanager-main-0\" (UID: \"d40cad4f-6109-4381-b7f5-2a037fc43ff8\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 04:00:09.227973 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:09.227748 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/d40cad4f-6109-4381-b7f5-2a037fc43ff8-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"d40cad4f-6109-4381-b7f5-2a037fc43ff8\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 04:00:09.227973 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:09.227789 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d40cad4f-6109-4381-b7f5-2a037fc43ff8-config-out\") pod \"alertmanager-main-0\" (UID: \"d40cad4f-6109-4381-b7f5-2a037fc43ff8\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 04:00:09.328652 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:09.328615 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/d40cad4f-6109-4381-b7f5-2a037fc43ff8-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"d40cad4f-6109-4381-b7f5-2a037fc43ff8\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 04:00:09.328652 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:09.328654 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ntxnm\" (UniqueName: \"kubernetes.io/projected/d40cad4f-6109-4381-b7f5-2a037fc43ff8-kube-api-access-ntxnm\") pod \"alertmanager-main-0\" (UID: \"d40cad4f-6109-4381-b7f5-2a037fc43ff8\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 04:00:09.328895 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:09.328682 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/d40cad4f-6109-4381-b7f5-2a037fc43ff8-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"d40cad4f-6109-4381-b7f5-2a037fc43ff8\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 04:00:09.328895 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:09.328726 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d40cad4f-6109-4381-b7f5-2a037fc43ff8-config-out\") pod \"alertmanager-main-0\" (UID: \"d40cad4f-6109-4381-b7f5-2a037fc43ff8\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 04:00:09.328895 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:09.328789 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/d40cad4f-6109-4381-b7f5-2a037fc43ff8-config-volume\") pod \"alertmanager-main-0\" (UID: \"d40cad4f-6109-4381-b7f5-2a037fc43ff8\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 04:00:09.328895 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:09.328812 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/d40cad4f-6109-4381-b7f5-2a037fc43ff8-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"d40cad4f-6109-4381-b7f5-2a037fc43ff8\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 04:00:09.328895 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:09.328837 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d40cad4f-6109-4381-b7f5-2a037fc43ff8-web-config\") pod \"alertmanager-main-0\" (UID: \"d40cad4f-6109-4381-b7f5-2a037fc43ff8\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 04:00:09.328895 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:09.328874 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d40cad4f-6109-4381-b7f5-2a037fc43ff8-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"d40cad4f-6109-4381-b7f5-2a037fc43ff8\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 04:00:09.329141 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:09.328915 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d40cad4f-6109-4381-b7f5-2a037fc43ff8-tls-assets\") pod \"alertmanager-main-0\" (UID: \"d40cad4f-6109-4381-b7f5-2a037fc43ff8\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 04:00:09.329141 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:09.328940 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d40cad4f-6109-4381-b7f5-2a037fc43ff8-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"d40cad4f-6109-4381-b7f5-2a037fc43ff8\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 04:00:09.329141 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:09.328962 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d40cad4f-6109-4381-b7f5-2a037fc43ff8-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"d40cad4f-6109-4381-b7f5-2a037fc43ff8\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 04:00:09.329141 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:09.329009 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/d40cad4f-6109-4381-b7f5-2a037fc43ff8-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"d40cad4f-6109-4381-b7f5-2a037fc43ff8\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 04:00:09.329141 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:09.329032 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d40cad4f-6109-4381-b7f5-2a037fc43ff8-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"d40cad4f-6109-4381-b7f5-2a037fc43ff8\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 04:00:09.329141 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:09.329094 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/d40cad4f-6109-4381-b7f5-2a037fc43ff8-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"d40cad4f-6109-4381-b7f5-2a037fc43ff8\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 04:00:09.329919 ip-10-0-134-136 kubenswrapper[2578]: E0421 04:00:09.329514 2578 secret.go:189] Couldn't get secret openshift-monitoring/alertmanager-main-tls: secret "alertmanager-main-tls" not found
Apr 21 04:00:09.329919 ip-10-0-134-136 kubenswrapper[2578]: E0421 04:00:09.329594 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d40cad4f-6109-4381-b7f5-2a037fc43ff8-secret-alertmanager-main-tls podName:d40cad4f-6109-4381-b7f5-2a037fc43ff8 nodeName:}" failed. No retries permitted until 2026-04-21 04:00:09.829572507 +0000 UTC m=+153.499011563 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "secret-alertmanager-main-tls" (UniqueName: "kubernetes.io/secret/d40cad4f-6109-4381-b7f5-2a037fc43ff8-secret-alertmanager-main-tls") pod "alertmanager-main-0" (UID: "d40cad4f-6109-4381-b7f5-2a037fc43ff8") : secret "alertmanager-main-tls" not found
Apr 21 04:00:09.329919 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:09.329784 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d40cad4f-6109-4381-b7f5-2a037fc43ff8-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"d40cad4f-6109-4381-b7f5-2a037fc43ff8\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 04:00:09.329919 ip-10-0-134-136 kubenswrapper[2578]: E0421 04:00:09.329827 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d40cad4f-6109-4381-b7f5-2a037fc43ff8-alertmanager-trusted-ca-bundle podName:d40cad4f-6109-4381-b7f5-2a037fc43ff8 nodeName:}" failed. No retries permitted until 2026-04-21 04:00:09.829806555 +0000 UTC m=+153.499245621 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "alertmanager-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/d40cad4f-6109-4381-b7f5-2a037fc43ff8-alertmanager-trusted-ca-bundle") pod "alertmanager-main-0" (UID: "d40cad4f-6109-4381-b7f5-2a037fc43ff8") : configmap references non-existent config key: ca-bundle.crt
Apr 21 04:00:09.332269 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:09.332240 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d40cad4f-6109-4381-b7f5-2a037fc43ff8-config-out\") pod \"alertmanager-main-0\" (UID: \"d40cad4f-6109-4381-b7f5-2a037fc43ff8\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 04:00:09.333085 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:09.333058 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d40cad4f-6109-4381-b7f5-2a037fc43ff8-web-config\") pod \"alertmanager-main-0\" (UID: \"d40cad4f-6109-4381-b7f5-2a037fc43ff8\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 04:00:09.333615 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:09.333582 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/d40cad4f-6109-4381-b7f5-2a037fc43ff8-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"d40cad4f-6109-4381-b7f5-2a037fc43ff8\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 04:00:09.333735 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:09.333710 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/d40cad4f-6109-4381-b7f5-2a037fc43ff8-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"d40cad4f-6109-4381-b7f5-2a037fc43ff8\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 04:00:09.334313 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:09.334268 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d40cad4f-6109-4381-b7f5-2a037fc43ff8-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"d40cad4f-6109-4381-b7f5-2a037fc43ff8\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 04:00:09.334475 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:09.334451 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d40cad4f-6109-4381-b7f5-2a037fc43ff8-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"d40cad4f-6109-4381-b7f5-2a037fc43ff8\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 04:00:09.334977 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:09.334935 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/d40cad4f-6109-4381-b7f5-2a037fc43ff8-config-volume\") pod \"alertmanager-main-0\" (UID: \"d40cad4f-6109-4381-b7f5-2a037fc43ff8\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 04:00:09.335405 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:09.335382 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d40cad4f-6109-4381-b7f5-2a037fc43ff8-tls-assets\") pod \"alertmanager-main-0\" (UID: \"d40cad4f-6109-4381-b7f5-2a037fc43ff8\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 04:00:09.336797 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:09.336776 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ntxnm\" (UniqueName: \"kubernetes.io/projected/d40cad4f-6109-4381-b7f5-2a037fc43ff8-kube-api-access-ntxnm\") pod \"alertmanager-main-0\" (UID: \"d40cad4f-6109-4381-b7f5-2a037fc43ff8\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 04:00:09.732927 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:09.732890 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/aa4733fb-75e2-4c77-bd5d-7ea904966689-node-exporter-tls\") pod \"node-exporter-zmnts\" (UID: \"aa4733fb-75e2-4c77-bd5d-7ea904966689\") " pod="openshift-monitoring/node-exporter-zmnts"
Apr 21 04:00:09.735420 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:09.735393 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/aa4733fb-75e2-4c77-bd5d-7ea904966689-node-exporter-tls\") pod \"node-exporter-zmnts\" (UID: \"aa4733fb-75e2-4c77-bd5d-7ea904966689\") " pod="openshift-monitoring/node-exporter-zmnts"
Apr 21 04:00:09.833740 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:09.833698 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/d40cad4f-6109-4381-b7f5-2a037fc43ff8-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"d40cad4f-6109-4381-b7f5-2a037fc43ff8\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 04:00:09.833924 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:09.833879 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d40cad4f-6109-4381-b7f5-2a037fc43ff8-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"d40cad4f-6109-4381-b7f5-2a037fc43ff8\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 04:00:09.835036 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:09.835009 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d40cad4f-6109-4381-b7f5-2a037fc43ff8-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"d40cad4f-6109-4381-b7f5-2a037fc43ff8\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 04:00:09.836519 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:09.836492 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/d40cad4f-6109-4381-b7f5-2a037fc43ff8-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"d40cad4f-6109-4381-b7f5-2a037fc43ff8\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 04:00:09.871343 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:09.871310 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-zmnts"
Apr 21 04:00:10.076835 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:10.076789 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 21 04:00:10.433240 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:10.433160 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-7bb9c48849-mjnfz"]
Apr 21 04:00:10.435890 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:10.435866 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7bb9c48849-mjnfz"
Apr 21 04:00:10.438411 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:10.438370 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\""
Apr 21 04:00:10.438550 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:10.438529 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\""
Apr 21 04:00:10.438624 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:10.438606 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\""
Apr 21 04:00:10.438682 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:10.438664 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\""
Apr 21 04:00:10.438786 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:10.438768 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\""
Apr 21 04:00:10.438931 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:10.438873 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-g7wts\""
Apr 21 04:00:10.450670 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:10.450392 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7bb9c48849-mjnfz"]
Apr 21 04:00:10.452042 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:10.452019 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\""
Apr 21 04:00:10.541264 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:10.541221 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a2c56362-323c-41f4-a1ca-b564476fd1a1-console-oauth-config\") pod \"console-7bb9c48849-mjnfz\" (UID: \"a2c56362-323c-41f4-a1ca-b564476fd1a1\") " pod="openshift-console/console-7bb9c48849-mjnfz"
Apr 21 04:00:10.541264 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:10.541269 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a2c56362-323c-41f4-a1ca-b564476fd1a1-oauth-serving-cert\") pod \"console-7bb9c48849-mjnfz\" (UID: \"a2c56362-323c-41f4-a1ca-b564476fd1a1\") " pod="openshift-console/console-7bb9c48849-mjnfz"
Apr 21 04:00:10.541525 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:10.541383 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a2c56362-323c-41f4-a1ca-b564476fd1a1-trusted-ca-bundle\") pod \"console-7bb9c48849-mjnfz\" (UID: \"a2c56362-323c-41f4-a1ca-b564476fd1a1\") " pod="openshift-console/console-7bb9c48849-mjnfz"
Apr 21 04:00:10.541525 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:10.541421 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a2c56362-323c-41f4-a1ca-b564476fd1a1-console-config\") pod \"console-7bb9c48849-mjnfz\" (UID: \"a2c56362-323c-41f4-a1ca-b564476fd1a1\") " pod="openshift-console/console-7bb9c48849-mjnfz"
Apr 21 04:00:10.541525 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:10.541501 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a2c56362-323c-41f4-a1ca-b564476fd1a1-console-serving-cert\") pod \"console-7bb9c48849-mjnfz\" (UID: \"a2c56362-323c-41f4-a1ca-b564476fd1a1\") " pod="openshift-console/console-7bb9c48849-mjnfz"
Apr 21 04:00:10.541769 ip-10-0-134-136
kubenswrapper[2578]: I0421 04:00:10.541576 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a2c56362-323c-41f4-a1ca-b564476fd1a1-service-ca\") pod \"console-7bb9c48849-mjnfz\" (UID: \"a2c56362-323c-41f4-a1ca-b564476fd1a1\") " pod="openshift-console/console-7bb9c48849-mjnfz" Apr 21 04:00:10.541769 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:10.541646 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-td6mr\" (UniqueName: \"kubernetes.io/projected/a2c56362-323c-41f4-a1ca-b564476fd1a1-kube-api-access-td6mr\") pod \"console-7bb9c48849-mjnfz\" (UID: \"a2c56362-323c-41f4-a1ca-b564476fd1a1\") " pod="openshift-console/console-7bb9c48849-mjnfz" Apr 21 04:00:10.643003 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:10.642959 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a2c56362-323c-41f4-a1ca-b564476fd1a1-console-config\") pod \"console-7bb9c48849-mjnfz\" (UID: \"a2c56362-323c-41f4-a1ca-b564476fd1a1\") " pod="openshift-console/console-7bb9c48849-mjnfz" Apr 21 04:00:10.643196 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:10.643129 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a2c56362-323c-41f4-a1ca-b564476fd1a1-console-serving-cert\") pod \"console-7bb9c48849-mjnfz\" (UID: \"a2c56362-323c-41f4-a1ca-b564476fd1a1\") " pod="openshift-console/console-7bb9c48849-mjnfz" Apr 21 04:00:10.643196 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:10.643155 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a2c56362-323c-41f4-a1ca-b564476fd1a1-service-ca\") pod \"console-7bb9c48849-mjnfz\" (UID: \"a2c56362-323c-41f4-a1ca-b564476fd1a1\") " 
pod="openshift-console/console-7bb9c48849-mjnfz" Apr 21 04:00:10.643343 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:10.643214 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-td6mr\" (UniqueName: \"kubernetes.io/projected/a2c56362-323c-41f4-a1ca-b564476fd1a1-kube-api-access-td6mr\") pod \"console-7bb9c48849-mjnfz\" (UID: \"a2c56362-323c-41f4-a1ca-b564476fd1a1\") " pod="openshift-console/console-7bb9c48849-mjnfz" Apr 21 04:00:10.643343 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:10.643251 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a2c56362-323c-41f4-a1ca-b564476fd1a1-console-oauth-config\") pod \"console-7bb9c48849-mjnfz\" (UID: \"a2c56362-323c-41f4-a1ca-b564476fd1a1\") " pod="openshift-console/console-7bb9c48849-mjnfz" Apr 21 04:00:10.643343 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:10.643273 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a2c56362-323c-41f4-a1ca-b564476fd1a1-oauth-serving-cert\") pod \"console-7bb9c48849-mjnfz\" (UID: \"a2c56362-323c-41f4-a1ca-b564476fd1a1\") " pod="openshift-console/console-7bb9c48849-mjnfz" Apr 21 04:00:10.643542 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:10.643472 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a2c56362-323c-41f4-a1ca-b564476fd1a1-trusted-ca-bundle\") pod \"console-7bb9c48849-mjnfz\" (UID: \"a2c56362-323c-41f4-a1ca-b564476fd1a1\") " pod="openshift-console/console-7bb9c48849-mjnfz" Apr 21 04:00:10.643888 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:10.643853 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a2c56362-323c-41f4-a1ca-b564476fd1a1-console-config\") pod 
\"console-7bb9c48849-mjnfz\" (UID: \"a2c56362-323c-41f4-a1ca-b564476fd1a1\") " pod="openshift-console/console-7bb9c48849-mjnfz" Apr 21 04:00:10.644063 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:10.644018 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a2c56362-323c-41f4-a1ca-b564476fd1a1-oauth-serving-cert\") pod \"console-7bb9c48849-mjnfz\" (UID: \"a2c56362-323c-41f4-a1ca-b564476fd1a1\") " pod="openshift-console/console-7bb9c48849-mjnfz" Apr 21 04:00:10.644168 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:10.644104 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a2c56362-323c-41f4-a1ca-b564476fd1a1-service-ca\") pod \"console-7bb9c48849-mjnfz\" (UID: \"a2c56362-323c-41f4-a1ca-b564476fd1a1\") " pod="openshift-console/console-7bb9c48849-mjnfz" Apr 21 04:00:10.644393 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:10.644367 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a2c56362-323c-41f4-a1ca-b564476fd1a1-trusted-ca-bundle\") pod \"console-7bb9c48849-mjnfz\" (UID: \"a2c56362-323c-41f4-a1ca-b564476fd1a1\") " pod="openshift-console/console-7bb9c48849-mjnfz" Apr 21 04:00:10.646060 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:10.646036 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a2c56362-323c-41f4-a1ca-b564476fd1a1-console-serving-cert\") pod \"console-7bb9c48849-mjnfz\" (UID: \"a2c56362-323c-41f4-a1ca-b564476fd1a1\") " pod="openshift-console/console-7bb9c48849-mjnfz" Apr 21 04:00:10.646216 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:10.646191 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/a2c56362-323c-41f4-a1ca-b564476fd1a1-console-oauth-config\") pod \"console-7bb9c48849-mjnfz\" (UID: \"a2c56362-323c-41f4-a1ca-b564476fd1a1\") " pod="openshift-console/console-7bb9c48849-mjnfz" Apr 21 04:00:10.651774 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:10.651718 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-td6mr\" (UniqueName: \"kubernetes.io/projected/a2c56362-323c-41f4-a1ca-b564476fd1a1-kube-api-access-td6mr\") pod \"console-7bb9c48849-mjnfz\" (UID: \"a2c56362-323c-41f4-a1ca-b564476fd1a1\") " pod="openshift-console/console-7bb9c48849-mjnfz" Apr 21 04:00:10.754005 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:10.753906 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7bb9c48849-mjnfz" Apr 21 04:00:11.033939 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:11.033902 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-6c69947b7-mcmmj"] Apr 21 04:00:11.037432 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:11.037399 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-6c69947b7-mcmmj" Apr 21 04:00:11.039982 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:11.039956 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\"" Apr 21 04:00:11.040214 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:11.040191 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\"" Apr 21 04:00:11.040345 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:11.040215 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\"" Apr 21 04:00:11.040412 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:11.040338 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-jzj9f\"" Apr 21 04:00:11.040474 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:11.040457 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\"" Apr 21 04:00:11.040613 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:11.040591 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\"" Apr 21 04:00:11.040734 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:11.040711 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-6i5m2drrij3ff\"" Apr 21 04:00:11.047826 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:11.047804 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-6c69947b7-mcmmj"] Apr 21 04:00:11.146964 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:11.146922 2578 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/b59fc3b6-a6a1-4455-b9b3-ae8335e2a143-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-6c69947b7-mcmmj\" (UID: \"b59fc3b6-a6a1-4455-b9b3-ae8335e2a143\") " pod="openshift-monitoring/thanos-querier-6c69947b7-mcmmj" Apr 21 04:00:11.147138 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:11.146987 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b59fc3b6-a6a1-4455-b9b3-ae8335e2a143-metrics-client-ca\") pod \"thanos-querier-6c69947b7-mcmmj\" (UID: \"b59fc3b6-a6a1-4455-b9b3-ae8335e2a143\") " pod="openshift-monitoring/thanos-querier-6c69947b7-mcmmj" Apr 21 04:00:11.147202 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:11.147135 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b59fc3b6-a6a1-4455-b9b3-ae8335e2a143-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-6c69947b7-mcmmj\" (UID: \"b59fc3b6-a6a1-4455-b9b3-ae8335e2a143\") " pod="openshift-monitoring/thanos-querier-6c69947b7-mcmmj" Apr 21 04:00:11.147202 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:11.147185 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/b59fc3b6-a6a1-4455-b9b3-ae8335e2a143-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-6c69947b7-mcmmj\" (UID: \"b59fc3b6-a6a1-4455-b9b3-ae8335e2a143\") " pod="openshift-monitoring/thanos-querier-6c69947b7-mcmmj" Apr 21 04:00:11.147318 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:11.147218 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/b59fc3b6-a6a1-4455-b9b3-ae8335e2a143-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-6c69947b7-mcmmj\" (UID: \"b59fc3b6-a6a1-4455-b9b3-ae8335e2a143\") " pod="openshift-monitoring/thanos-querier-6c69947b7-mcmmj" Apr 21 04:00:11.147318 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:11.147266 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/b59fc3b6-a6a1-4455-b9b3-ae8335e2a143-secret-grpc-tls\") pod \"thanos-querier-6c69947b7-mcmmj\" (UID: \"b59fc3b6-a6a1-4455-b9b3-ae8335e2a143\") " pod="openshift-monitoring/thanos-querier-6c69947b7-mcmmj" Apr 21 04:00:11.147395 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:11.147315 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/b59fc3b6-a6a1-4455-b9b3-ae8335e2a143-secret-thanos-querier-tls\") pod \"thanos-querier-6c69947b7-mcmmj\" (UID: \"b59fc3b6-a6a1-4455-b9b3-ae8335e2a143\") " pod="openshift-monitoring/thanos-querier-6c69947b7-mcmmj" Apr 21 04:00:11.147395 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:11.147346 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4p5xh\" (UniqueName: \"kubernetes.io/projected/b59fc3b6-a6a1-4455-b9b3-ae8335e2a143-kube-api-access-4p5xh\") pod \"thanos-querier-6c69947b7-mcmmj\" (UID: \"b59fc3b6-a6a1-4455-b9b3-ae8335e2a143\") " pod="openshift-monitoring/thanos-querier-6c69947b7-mcmmj" Apr 21 04:00:11.248902 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:11.248692 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: 
\"kubernetes.io/secret/b59fc3b6-a6a1-4455-b9b3-ae8335e2a143-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-6c69947b7-mcmmj\" (UID: \"b59fc3b6-a6a1-4455-b9b3-ae8335e2a143\") " pod="openshift-monitoring/thanos-querier-6c69947b7-mcmmj" Apr 21 04:00:11.248902 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:11.248764 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b59fc3b6-a6a1-4455-b9b3-ae8335e2a143-metrics-client-ca\") pod \"thanos-querier-6c69947b7-mcmmj\" (UID: \"b59fc3b6-a6a1-4455-b9b3-ae8335e2a143\") " pod="openshift-monitoring/thanos-querier-6c69947b7-mcmmj" Apr 21 04:00:11.248902 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:11.248831 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b59fc3b6-a6a1-4455-b9b3-ae8335e2a143-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-6c69947b7-mcmmj\" (UID: \"b59fc3b6-a6a1-4455-b9b3-ae8335e2a143\") " pod="openshift-monitoring/thanos-querier-6c69947b7-mcmmj" Apr 21 04:00:11.249781 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:11.249225 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/b59fc3b6-a6a1-4455-b9b3-ae8335e2a143-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-6c69947b7-mcmmj\" (UID: \"b59fc3b6-a6a1-4455-b9b3-ae8335e2a143\") " pod="openshift-monitoring/thanos-querier-6c69947b7-mcmmj" Apr 21 04:00:11.249781 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:11.249308 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/b59fc3b6-a6a1-4455-b9b3-ae8335e2a143-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-6c69947b7-mcmmj\" 
(UID: \"b59fc3b6-a6a1-4455-b9b3-ae8335e2a143\") " pod="openshift-monitoring/thanos-querier-6c69947b7-mcmmj" Apr 21 04:00:11.249781 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:11.249351 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/b59fc3b6-a6a1-4455-b9b3-ae8335e2a143-secret-grpc-tls\") pod \"thanos-querier-6c69947b7-mcmmj\" (UID: \"b59fc3b6-a6a1-4455-b9b3-ae8335e2a143\") " pod="openshift-monitoring/thanos-querier-6c69947b7-mcmmj" Apr 21 04:00:11.249781 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:11.249387 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/b59fc3b6-a6a1-4455-b9b3-ae8335e2a143-secret-thanos-querier-tls\") pod \"thanos-querier-6c69947b7-mcmmj\" (UID: \"b59fc3b6-a6a1-4455-b9b3-ae8335e2a143\") " pod="openshift-monitoring/thanos-querier-6c69947b7-mcmmj" Apr 21 04:00:11.249781 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:11.249420 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4p5xh\" (UniqueName: \"kubernetes.io/projected/b59fc3b6-a6a1-4455-b9b3-ae8335e2a143-kube-api-access-4p5xh\") pod \"thanos-querier-6c69947b7-mcmmj\" (UID: \"b59fc3b6-a6a1-4455-b9b3-ae8335e2a143\") " pod="openshift-monitoring/thanos-querier-6c69947b7-mcmmj" Apr 21 04:00:11.249781 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:11.249775 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b59fc3b6-a6a1-4455-b9b3-ae8335e2a143-metrics-client-ca\") pod \"thanos-querier-6c69947b7-mcmmj\" (UID: \"b59fc3b6-a6a1-4455-b9b3-ae8335e2a143\") " pod="openshift-monitoring/thanos-querier-6c69947b7-mcmmj" Apr 21 04:00:11.252251 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:11.252215 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b59fc3b6-a6a1-4455-b9b3-ae8335e2a143-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-6c69947b7-mcmmj\" (UID: \"b59fc3b6-a6a1-4455-b9b3-ae8335e2a143\") " pod="openshift-monitoring/thanos-querier-6c69947b7-mcmmj" Apr 21 04:00:11.252739 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:11.252713 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/b59fc3b6-a6a1-4455-b9b3-ae8335e2a143-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-6c69947b7-mcmmj\" (UID: \"b59fc3b6-a6a1-4455-b9b3-ae8335e2a143\") " pod="openshift-monitoring/thanos-querier-6c69947b7-mcmmj" Apr 21 04:00:11.252739 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:11.252737 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/b59fc3b6-a6a1-4455-b9b3-ae8335e2a143-secret-grpc-tls\") pod \"thanos-querier-6c69947b7-mcmmj\" (UID: \"b59fc3b6-a6a1-4455-b9b3-ae8335e2a143\") " pod="openshift-monitoring/thanos-querier-6c69947b7-mcmmj" Apr 21 04:00:11.253070 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:11.253048 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/b59fc3b6-a6a1-4455-b9b3-ae8335e2a143-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-6c69947b7-mcmmj\" (UID: \"b59fc3b6-a6a1-4455-b9b3-ae8335e2a143\") " pod="openshift-monitoring/thanos-querier-6c69947b7-mcmmj" Apr 21 04:00:11.253174 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:11.253154 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/b59fc3b6-a6a1-4455-b9b3-ae8335e2a143-secret-thanos-querier-tls\") pod \"thanos-querier-6c69947b7-mcmmj\" (UID: 
\"b59fc3b6-a6a1-4455-b9b3-ae8335e2a143\") " pod="openshift-monitoring/thanos-querier-6c69947b7-mcmmj" Apr 21 04:00:11.253815 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:11.253793 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/b59fc3b6-a6a1-4455-b9b3-ae8335e2a143-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-6c69947b7-mcmmj\" (UID: \"b59fc3b6-a6a1-4455-b9b3-ae8335e2a143\") " pod="openshift-monitoring/thanos-querier-6c69947b7-mcmmj" Apr 21 04:00:11.256800 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:11.256774 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4p5xh\" (UniqueName: \"kubernetes.io/projected/b59fc3b6-a6a1-4455-b9b3-ae8335e2a143-kube-api-access-4p5xh\") pod \"thanos-querier-6c69947b7-mcmmj\" (UID: \"b59fc3b6-a6a1-4455-b9b3-ae8335e2a143\") " pod="openshift-monitoring/thanos-querier-6c69947b7-mcmmj" Apr 21 04:00:11.349659 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:11.349568 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-6c69947b7-mcmmj" Apr 21 04:00:11.747407 ip-10-0-134-136 kubenswrapper[2578]: E0421 04:00:11.747311 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-lk9pj" podUID="5b9715dc-7dd5-46ea-961a-2f02107a7655" Apr 21 04:00:11.775566 ip-10-0-134-136 kubenswrapper[2578]: E0421 04:00:11.775525 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-sjcwm" podUID="e9350a7b-5fe0-4e30-8c05-5ee260472029" Apr 21 04:00:12.408184 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:12.408143 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-sjcwm" Apr 21 04:00:12.408380 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:12.408143 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-lk9pj" Apr 21 04:00:12.436306 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:12.436265 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-fd55fd488-h8sm5"] Apr 21 04:00:12.440989 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:12.440956 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-fd55fd488-h8sm5" Apr 21 04:00:12.443723 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:12.443690 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 21 04:00:12.443723 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:12.443712 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\"" Apr 21 04:00:12.443918 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:12.443779 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\"" Apr 21 04:00:12.444814 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:12.444749 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\"" Apr 21 04:00:12.445136 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:12.445032 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-df9cauh1mh6v6\"" Apr 21 04:00:12.445365 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:12.445347 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-nwvvb\"" Apr 21 04:00:12.447875 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:12.447843 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-fd55fd488-h8sm5"] Apr 21 04:00:12.561447 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:12.561409 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/972cb6a5-a2e1-4220-b72b-f7d3cf55ce7a-audit-log\") pod \"metrics-server-fd55fd488-h8sm5\" (UID: \"972cb6a5-a2e1-4220-b72b-f7d3cf55ce7a\") " 
pod="openshift-monitoring/metrics-server-fd55fd488-h8sm5" Apr 21 04:00:12.561623 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:12.561476 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/972cb6a5-a2e1-4220-b72b-f7d3cf55ce7a-client-ca-bundle\") pod \"metrics-server-fd55fd488-h8sm5\" (UID: \"972cb6a5-a2e1-4220-b72b-f7d3cf55ce7a\") " pod="openshift-monitoring/metrics-server-fd55fd488-h8sm5" Apr 21 04:00:12.561623 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:12.561508 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w64vd\" (UniqueName: \"kubernetes.io/projected/972cb6a5-a2e1-4220-b72b-f7d3cf55ce7a-kube-api-access-w64vd\") pod \"metrics-server-fd55fd488-h8sm5\" (UID: \"972cb6a5-a2e1-4220-b72b-f7d3cf55ce7a\") " pod="openshift-monitoring/metrics-server-fd55fd488-h8sm5" Apr 21 04:00:12.561623 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:12.561562 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/972cb6a5-a2e1-4220-b72b-f7d3cf55ce7a-secret-metrics-server-tls\") pod \"metrics-server-fd55fd488-h8sm5\" (UID: \"972cb6a5-a2e1-4220-b72b-f7d3cf55ce7a\") " pod="openshift-monitoring/metrics-server-fd55fd488-h8sm5" Apr 21 04:00:12.561623 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:12.561611 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/972cb6a5-a2e1-4220-b72b-f7d3cf55ce7a-metrics-server-audit-profiles\") pod \"metrics-server-fd55fd488-h8sm5\" (UID: \"972cb6a5-a2e1-4220-b72b-f7d3cf55ce7a\") " pod="openshift-monitoring/metrics-server-fd55fd488-h8sm5" Apr 21 04:00:12.561823 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:12.561699 2578 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/972cb6a5-a2e1-4220-b72b-f7d3cf55ce7a-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-fd55fd488-h8sm5\" (UID: \"972cb6a5-a2e1-4220-b72b-f7d3cf55ce7a\") " pod="openshift-monitoring/metrics-server-fd55fd488-h8sm5" Apr 21 04:00:12.561823 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:12.561802 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/972cb6a5-a2e1-4220-b72b-f7d3cf55ce7a-secret-metrics-server-client-certs\") pod \"metrics-server-fd55fd488-h8sm5\" (UID: \"972cb6a5-a2e1-4220-b72b-f7d3cf55ce7a\") " pod="openshift-monitoring/metrics-server-fd55fd488-h8sm5" Apr 21 04:00:12.662650 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:12.662555 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/972cb6a5-a2e1-4220-b72b-f7d3cf55ce7a-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-fd55fd488-h8sm5\" (UID: \"972cb6a5-a2e1-4220-b72b-f7d3cf55ce7a\") " pod="openshift-monitoring/metrics-server-fd55fd488-h8sm5" Apr 21 04:00:12.662815 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:12.662633 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/972cb6a5-a2e1-4220-b72b-f7d3cf55ce7a-secret-metrics-server-client-certs\") pod \"metrics-server-fd55fd488-h8sm5\" (UID: \"972cb6a5-a2e1-4220-b72b-f7d3cf55ce7a\") " pod="openshift-monitoring/metrics-server-fd55fd488-h8sm5" Apr 21 04:00:12.662815 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:12.662687 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: 
\"kubernetes.io/empty-dir/972cb6a5-a2e1-4220-b72b-f7d3cf55ce7a-audit-log\") pod \"metrics-server-fd55fd488-h8sm5\" (UID: \"972cb6a5-a2e1-4220-b72b-f7d3cf55ce7a\") " pod="openshift-monitoring/metrics-server-fd55fd488-h8sm5" Apr 21 04:00:12.662815 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:12.662724 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/972cb6a5-a2e1-4220-b72b-f7d3cf55ce7a-client-ca-bundle\") pod \"metrics-server-fd55fd488-h8sm5\" (UID: \"972cb6a5-a2e1-4220-b72b-f7d3cf55ce7a\") " pod="openshift-monitoring/metrics-server-fd55fd488-h8sm5" Apr 21 04:00:12.662815 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:12.662748 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w64vd\" (UniqueName: \"kubernetes.io/projected/972cb6a5-a2e1-4220-b72b-f7d3cf55ce7a-kube-api-access-w64vd\") pod \"metrics-server-fd55fd488-h8sm5\" (UID: \"972cb6a5-a2e1-4220-b72b-f7d3cf55ce7a\") " pod="openshift-monitoring/metrics-server-fd55fd488-h8sm5" Apr 21 04:00:12.662815 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:12.662797 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/972cb6a5-a2e1-4220-b72b-f7d3cf55ce7a-secret-metrics-server-tls\") pod \"metrics-server-fd55fd488-h8sm5\" (UID: \"972cb6a5-a2e1-4220-b72b-f7d3cf55ce7a\") " pod="openshift-monitoring/metrics-server-fd55fd488-h8sm5" Apr 21 04:00:12.663067 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:12.662850 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/972cb6a5-a2e1-4220-b72b-f7d3cf55ce7a-metrics-server-audit-profiles\") pod \"metrics-server-fd55fd488-h8sm5\" (UID: \"972cb6a5-a2e1-4220-b72b-f7d3cf55ce7a\") " pod="openshift-monitoring/metrics-server-fd55fd488-h8sm5" Apr 21 
04:00:12.663184 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:12.663158 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/972cb6a5-a2e1-4220-b72b-f7d3cf55ce7a-audit-log\") pod \"metrics-server-fd55fd488-h8sm5\" (UID: \"972cb6a5-a2e1-4220-b72b-f7d3cf55ce7a\") " pod="openshift-monitoring/metrics-server-fd55fd488-h8sm5" Apr 21 04:00:12.663450 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:12.663407 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/972cb6a5-a2e1-4220-b72b-f7d3cf55ce7a-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-fd55fd488-h8sm5\" (UID: \"972cb6a5-a2e1-4220-b72b-f7d3cf55ce7a\") " pod="openshift-monitoring/metrics-server-fd55fd488-h8sm5" Apr 21 04:00:12.663861 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:12.663840 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/972cb6a5-a2e1-4220-b72b-f7d3cf55ce7a-metrics-server-audit-profiles\") pod \"metrics-server-fd55fd488-h8sm5\" (UID: \"972cb6a5-a2e1-4220-b72b-f7d3cf55ce7a\") " pod="openshift-monitoring/metrics-server-fd55fd488-h8sm5" Apr 21 04:00:12.665652 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:12.665625 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/972cb6a5-a2e1-4220-b72b-f7d3cf55ce7a-client-ca-bundle\") pod \"metrics-server-fd55fd488-h8sm5\" (UID: \"972cb6a5-a2e1-4220-b72b-f7d3cf55ce7a\") " pod="openshift-monitoring/metrics-server-fd55fd488-h8sm5" Apr 21 04:00:12.665652 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:12.665638 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: 
\"kubernetes.io/secret/972cb6a5-a2e1-4220-b72b-f7d3cf55ce7a-secret-metrics-server-client-certs\") pod \"metrics-server-fd55fd488-h8sm5\" (UID: \"972cb6a5-a2e1-4220-b72b-f7d3cf55ce7a\") " pod="openshift-monitoring/metrics-server-fd55fd488-h8sm5" Apr 21 04:00:12.665823 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:12.665657 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/972cb6a5-a2e1-4220-b72b-f7d3cf55ce7a-secret-metrics-server-tls\") pod \"metrics-server-fd55fd488-h8sm5\" (UID: \"972cb6a5-a2e1-4220-b72b-f7d3cf55ce7a\") " pod="openshift-monitoring/metrics-server-fd55fd488-h8sm5" Apr 21 04:00:12.670538 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:12.670515 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w64vd\" (UniqueName: \"kubernetes.io/projected/972cb6a5-a2e1-4220-b72b-f7d3cf55ce7a-kube-api-access-w64vd\") pod \"metrics-server-fd55fd488-h8sm5\" (UID: \"972cb6a5-a2e1-4220-b72b-f7d3cf55ce7a\") " pod="openshift-monitoring/metrics-server-fd55fd488-h8sm5" Apr 21 04:00:12.752939 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:12.752887 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-fd55fd488-h8sm5" Apr 21 04:00:12.819257 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:12.819197 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-r88vj"] Apr 21 04:00:12.823368 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:12.823345 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-r88vj" Apr 21 04:00:12.826071 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:12.826044 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\"" Apr 21 04:00:12.826362 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:12.826346 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-msbms\"" Apr 21 04:00:12.831863 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:12.831583 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-r88vj"] Apr 21 04:00:12.865456 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:12.865426 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/976a9808-8386-44d6-964f-4f35e8b7bf8f-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-r88vj\" (UID: \"976a9808-8386-44d6-964f-4f35e8b7bf8f\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-r88vj" Apr 21 04:00:12.942738 ip-10-0-134-136 kubenswrapper[2578]: E0421 04:00:12.942646 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-2lrq9" podUID="6e98b17f-4794-44aa-8756-58a9bd9cb37a" Apr 21 04:00:12.966090 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:12.966055 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/976a9808-8386-44d6-964f-4f35e8b7bf8f-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-r88vj\" (UID: \"976a9808-8386-44d6-964f-4f35e8b7bf8f\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-r88vj" Apr 21 04:00:12.966258 
ip-10-0-134-136 kubenswrapper[2578]: E0421 04:00:12.966234 2578 secret.go:189] Couldn't get secret openshift-monitoring/monitoring-plugin-cert: secret "monitoring-plugin-cert" not found Apr 21 04:00:12.966370 ip-10-0-134-136 kubenswrapper[2578]: E0421 04:00:12.966349 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/976a9808-8386-44d6-964f-4f35e8b7bf8f-monitoring-plugin-cert podName:976a9808-8386-44d6-964f-4f35e8b7bf8f nodeName:}" failed. No retries permitted until 2026-04-21 04:00:13.466320115 +0000 UTC m=+157.135759176 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "monitoring-plugin-cert" (UniqueName: "kubernetes.io/secret/976a9808-8386-44d6-964f-4f35e8b7bf8f-monitoring-plugin-cert") pod "monitoring-plugin-7dccd58f55-r88vj" (UID: "976a9808-8386-44d6-964f-4f35e8b7bf8f") : secret "monitoring-plugin-cert" not found Apr 21 04:00:13.472095 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:13.472051 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/976a9808-8386-44d6-964f-4f35e8b7bf8f-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-r88vj\" (UID: \"976a9808-8386-44d6-964f-4f35e8b7bf8f\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-r88vj" Apr 21 04:00:13.472300 ip-10-0-134-136 kubenswrapper[2578]: E0421 04:00:13.472202 2578 secret.go:189] Couldn't get secret openshift-monitoring/monitoring-plugin-cert: secret "monitoring-plugin-cert" not found Apr 21 04:00:13.472361 ip-10-0-134-136 kubenswrapper[2578]: E0421 04:00:13.472304 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/976a9808-8386-44d6-964f-4f35e8b7bf8f-monitoring-plugin-cert podName:976a9808-8386-44d6-964f-4f35e8b7bf8f nodeName:}" failed. No retries permitted until 2026-04-21 04:00:14.472267818 +0000 UTC m=+158.141706893 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "monitoring-plugin-cert" (UniqueName: "kubernetes.io/secret/976a9808-8386-44d6-964f-4f35e8b7bf8f-monitoring-plugin-cert") pod "monitoring-plugin-7dccd58f55-r88vj" (UID: "976a9808-8386-44d6-964f-4f35e8b7bf8f") : secret "monitoring-plugin-cert" not found Apr 21 04:00:14.482557 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:14.482508 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/976a9808-8386-44d6-964f-4f35e8b7bf8f-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-r88vj\" (UID: \"976a9808-8386-44d6-964f-4f35e8b7bf8f\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-r88vj" Apr 21 04:00:14.486868 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:14.486833 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/976a9808-8386-44d6-964f-4f35e8b7bf8f-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-r88vj\" (UID: \"976a9808-8386-44d6-964f-4f35e8b7bf8f\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-r88vj" Apr 21 04:00:14.635203 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:14.635166 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-r88vj" Apr 21 04:00:16.601525 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:16.601482 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5b9715dc-7dd5-46ea-961a-2f02107a7655-metrics-tls\") pod \"dns-default-lk9pj\" (UID: \"5b9715dc-7dd5-46ea-961a-2f02107a7655\") " pod="openshift-dns/dns-default-lk9pj" Apr 21 04:00:16.601525 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:16.601526 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e9350a7b-5fe0-4e30-8c05-5ee260472029-cert\") pod \"ingress-canary-sjcwm\" (UID: \"e9350a7b-5fe0-4e30-8c05-5ee260472029\") " pod="openshift-ingress-canary/ingress-canary-sjcwm" Apr 21 04:00:16.604122 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:16.604088 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5b9715dc-7dd5-46ea-961a-2f02107a7655-metrics-tls\") pod \"dns-default-lk9pj\" (UID: \"5b9715dc-7dd5-46ea-961a-2f02107a7655\") " pod="openshift-dns/dns-default-lk9pj" Apr 21 04:00:16.604261 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:16.604147 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e9350a7b-5fe0-4e30-8c05-5ee260472029-cert\") pod \"ingress-canary-sjcwm\" (UID: \"e9350a7b-5fe0-4e30-8c05-5ee260472029\") " pod="openshift-ingress-canary/ingress-canary-sjcwm" Apr 21 04:00:16.611916 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:16.611855 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-7mknc\"" Apr 21 04:00:16.612953 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:16.612929 2578 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-dns\"/\"dns-dockercfg-7tx7c\"" Apr 21 04:00:16.619182 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:16.619156 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-lk9pj" Apr 21 04:00:16.619338 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:16.619241 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-sjcwm" Apr 21 04:00:17.090919 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:17.090856 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-sjcwm"] Apr 21 04:00:17.092760 ip-10-0-134-136 kubenswrapper[2578]: W0421 04:00:17.092719 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode9350a7b_5fe0_4e30_8c05_5ee260472029.slice/crio-b02ed04ef9374ceac7327a27439654c31ea9f324f25aa706bfa41d8c53b097fc WatchSource:0}: Error finding container b02ed04ef9374ceac7327a27439654c31ea9f324f25aa706bfa41d8c53b097fc: Status 404 returned error can't find the container with id b02ed04ef9374ceac7327a27439654c31ea9f324f25aa706bfa41d8c53b097fc Apr 21 04:00:17.116324 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:17.116298 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-lk9pj"] Apr 21 04:00:17.118772 ip-10-0-134-136 kubenswrapper[2578]: W0421 04:00:17.118739 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5b9715dc_7dd5_46ea_961a_2f02107a7655.slice/crio-4cba560ea58d6effce201c1427c26bda2aa34ded044e5d476a5a29b5968a404e WatchSource:0}: Error finding container 4cba560ea58d6effce201c1427c26bda2aa34ded044e5d476a5a29b5968a404e: Status 404 returned error can't find the container with id 4cba560ea58d6effce201c1427c26bda2aa34ded044e5d476a5a29b5968a404e Apr 21 04:00:17.337600 ip-10-0-134-136 
kubenswrapper[2578]: I0421 04:00:17.337473 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-r88vj"] Apr 21 04:00:17.341267 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:17.341243 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7bb9c48849-mjnfz"] Apr 21 04:00:17.342183 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:17.342122 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-fd55fd488-h8sm5"] Apr 21 04:00:17.345577 ip-10-0-134-136 kubenswrapper[2578]: W0421 04:00:17.345405 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod972cb6a5_a2e1_4220_b72b_f7d3cf55ce7a.slice/crio-e52d28c37871a7bfd615e31e5b352e27c38ebf0779a35f670c45f1d94ff5d9c0 WatchSource:0}: Error finding container e52d28c37871a7bfd615e31e5b352e27c38ebf0779a35f670c45f1d94ff5d9c0: Status 404 returned error can't find the container with id e52d28c37871a7bfd615e31e5b352e27c38ebf0779a35f670c45f1d94ff5d9c0 Apr 21 04:00:17.347033 ip-10-0-134-136 kubenswrapper[2578]: W0421 04:00:17.346977 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda2c56362_323c_41f4_a1ca_b564476fd1a1.slice/crio-1512ddb5dc9a701a9714a791675a6267f47d50c9bb7a4f884adf2abc49309f71 WatchSource:0}: Error finding container 1512ddb5dc9a701a9714a791675a6267f47d50c9bb7a4f884adf2abc49309f71: Status 404 returned error can't find the container with id 1512ddb5dc9a701a9714a791675a6267f47d50c9bb7a4f884adf2abc49309f71 Apr 21 04:00:17.387937 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:17.387903 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-6c69947b7-mcmmj"] Apr 21 04:00:17.391008 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:17.390894 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-monitoring/alertmanager-main-0"] Apr 21 04:00:17.393965 ip-10-0-134-136 kubenswrapper[2578]: W0421 04:00:17.393936 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb59fc3b6_a6a1_4455_b9b3_ae8335e2a143.slice/crio-64ef38a6d9a81dfdf313f3876dadd5db0db1cffbb93238a96987124504e12bf0 WatchSource:0}: Error finding container 64ef38a6d9a81dfdf313f3876dadd5db0db1cffbb93238a96987124504e12bf0: Status 404 returned error can't find the container with id 64ef38a6d9a81dfdf313f3876dadd5db0db1cffbb93238a96987124504e12bf0 Apr 21 04:00:17.394875 ip-10-0-134-136 kubenswrapper[2578]: W0421 04:00:17.394850 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd40cad4f_6109_4381_b7f5_2a037fc43ff8.slice/crio-f34dce73f172e2371ae7e0c77bce858ffaa24f6c443b398f43dfc2f5a954246f WatchSource:0}: Error finding container f34dce73f172e2371ae7e0c77bce858ffaa24f6c443b398f43dfc2f5a954246f: Status 404 returned error can't find the container with id f34dce73f172e2371ae7e0c77bce858ffaa24f6c443b398f43dfc2f5a954246f Apr 21 04:00:17.422817 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:17.422780 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-sjcwm" event={"ID":"e9350a7b-5fe0-4e30-8c05-5ee260472029","Type":"ContainerStarted","Data":"b02ed04ef9374ceac7327a27439654c31ea9f324f25aa706bfa41d8c53b097fc"} Apr 21 04:00:17.424602 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:17.424563 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7bb9c48849-mjnfz" event={"ID":"a2c56362-323c-41f4-a1ca-b564476fd1a1","Type":"ContainerStarted","Data":"1512ddb5dc9a701a9714a791675a6267f47d50c9bb7a4f884adf2abc49309f71"} Apr 21 04:00:17.425924 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:17.425854 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/metrics-server-fd55fd488-h8sm5" event={"ID":"972cb6a5-a2e1-4220-b72b-f7d3cf55ce7a","Type":"ContainerStarted","Data":"e52d28c37871a7bfd615e31e5b352e27c38ebf0779a35f670c45f1d94ff5d9c0"} Apr 21 04:00:17.427119 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:17.427077 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d40cad4f-6109-4381-b7f5-2a037fc43ff8","Type":"ContainerStarted","Data":"f34dce73f172e2371ae7e0c77bce858ffaa24f6c443b398f43dfc2f5a954246f"} Apr 21 04:00:17.429117 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:17.429082 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-qf2jm" event={"ID":"07645b09-21c7-466b-a46d-b48a72d9c654","Type":"ContainerStarted","Data":"8dc1d658f842913421a00adf3ae29acc806e2cc46faa36b739c54fca14ca79e4"} Apr 21 04:00:17.429366 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:17.429349 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-qf2jm" Apr 21 04:00:17.430471 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:17.430445 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6c69947b7-mcmmj" event={"ID":"b59fc3b6-a6a1-4455-b9b3-ae8335e2a143","Type":"ContainerStarted","Data":"64ef38a6d9a81dfdf313f3876dadd5db0db1cffbb93238a96987124504e12bf0"} Apr 21 04:00:17.432571 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:17.432546 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-lk9pj" event={"ID":"5b9715dc-7dd5-46ea-961a-2f02107a7655","Type":"ContainerStarted","Data":"4cba560ea58d6effce201c1427c26bda2aa34ded044e5d476a5a29b5968a404e"} Apr 21 04:00:17.433822 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:17.433801 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-r88vj" 
event={"ID":"976a9808-8386-44d6-964f-4f35e8b7bf8f","Type":"ContainerStarted","Data":"cae32ba30bf33cfd5412271574d3ddf5f58e92b8ca4b1a5f8889d70810f1697e"} Apr 21 04:00:17.435062 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:17.435037 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-zmnts" event={"ID":"aa4733fb-75e2-4c77-bd5d-7ea904966689","Type":"ContainerStarted","Data":"fdf5199a50d91d1a41f398ac0c395602f30d8202cda35c97045c362e70c06011"} Apr 21 04:00:17.445991 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:17.445917 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-qf2jm" Apr 21 04:00:17.448042 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:17.447996 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-qf2jm" podStartSLOduration=1.252495366 podStartE2EDuration="17.447979929s" podCreationTimestamp="2026-04-21 04:00:00 +0000 UTC" firstStartedPulling="2026-04-21 04:00:00.77092631 +0000 UTC m=+144.440365364" lastFinishedPulling="2026-04-21 04:00:16.96641087 +0000 UTC m=+160.635849927" observedRunningTime="2026-04-21 04:00:17.446175902 +0000 UTC m=+161.115615015" watchObservedRunningTime="2026-04-21 04:00:17.447979929 +0000 UTC m=+161.117419006" Apr 21 04:00:18.272987 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:18.270959 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-cdfc4cb68-9k284"] Apr 21 04:00:18.276311 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:18.276119 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-cdfc4cb68-9k284" Apr 21 04:00:18.294817 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:18.294762 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-cdfc4cb68-9k284"] Apr 21 04:00:18.434945 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:18.434859 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0542ee55-ebd2-4c1f-888f-a044bc8cc7dc-service-ca\") pod \"console-cdfc4cb68-9k284\" (UID: \"0542ee55-ebd2-4c1f-888f-a044bc8cc7dc\") " pod="openshift-console/console-cdfc4cb68-9k284" Apr 21 04:00:18.434945 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:18.434930 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0542ee55-ebd2-4c1f-888f-a044bc8cc7dc-trusted-ca-bundle\") pod \"console-cdfc4cb68-9k284\" (UID: \"0542ee55-ebd2-4c1f-888f-a044bc8cc7dc\") " pod="openshift-console/console-cdfc4cb68-9k284" Apr 21 04:00:18.435202 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:18.435029 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0542ee55-ebd2-4c1f-888f-a044bc8cc7dc-console-serving-cert\") pod \"console-cdfc4cb68-9k284\" (UID: \"0542ee55-ebd2-4c1f-888f-a044bc8cc7dc\") " pod="openshift-console/console-cdfc4cb68-9k284" Apr 21 04:00:18.435202 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:18.435060 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4f9sp\" (UniqueName: \"kubernetes.io/projected/0542ee55-ebd2-4c1f-888f-a044bc8cc7dc-kube-api-access-4f9sp\") pod \"console-cdfc4cb68-9k284\" (UID: \"0542ee55-ebd2-4c1f-888f-a044bc8cc7dc\") " pod="openshift-console/console-cdfc4cb68-9k284" Apr 21 
04:00:18.435202 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:18.435141 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0542ee55-ebd2-4c1f-888f-a044bc8cc7dc-console-oauth-config\") pod \"console-cdfc4cb68-9k284\" (UID: \"0542ee55-ebd2-4c1f-888f-a044bc8cc7dc\") " pod="openshift-console/console-cdfc4cb68-9k284" Apr 21 04:00:18.435202 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:18.435166 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0542ee55-ebd2-4c1f-888f-a044bc8cc7dc-console-config\") pod \"console-cdfc4cb68-9k284\" (UID: \"0542ee55-ebd2-4c1f-888f-a044bc8cc7dc\") " pod="openshift-console/console-cdfc4cb68-9k284" Apr 21 04:00:18.435202 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:18.435190 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0542ee55-ebd2-4c1f-888f-a044bc8cc7dc-oauth-serving-cert\") pod \"console-cdfc4cb68-9k284\" (UID: \"0542ee55-ebd2-4c1f-888f-a044bc8cc7dc\") " pod="openshift-console/console-cdfc4cb68-9k284" Apr 21 04:00:18.446972 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:18.446932 2578 generic.go:358] "Generic (PLEG): container finished" podID="aa4733fb-75e2-4c77-bd5d-7ea904966689" containerID="7563c7456e862534ffd0407c25191484e84220c3f7d33951e6d302da17bff20b" exitCode=0 Apr 21 04:00:18.448412 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:18.448112 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-zmnts" event={"ID":"aa4733fb-75e2-4c77-bd5d-7ea904966689","Type":"ContainerDied","Data":"7563c7456e862534ffd0407c25191484e84220c3f7d33951e6d302da17bff20b"} Apr 21 04:00:18.536404 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:18.536356 2578 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0542ee55-ebd2-4c1f-888f-a044bc8cc7dc-console-oauth-config\") pod \"console-cdfc4cb68-9k284\" (UID: \"0542ee55-ebd2-4c1f-888f-a044bc8cc7dc\") " pod="openshift-console/console-cdfc4cb68-9k284" Apr 21 04:00:18.536404 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:18.536405 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0542ee55-ebd2-4c1f-888f-a044bc8cc7dc-console-config\") pod \"console-cdfc4cb68-9k284\" (UID: \"0542ee55-ebd2-4c1f-888f-a044bc8cc7dc\") " pod="openshift-console/console-cdfc4cb68-9k284" Apr 21 04:00:18.536404 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:18.536430 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0542ee55-ebd2-4c1f-888f-a044bc8cc7dc-oauth-serving-cert\") pod \"console-cdfc4cb68-9k284\" (UID: \"0542ee55-ebd2-4c1f-888f-a044bc8cc7dc\") " pod="openshift-console/console-cdfc4cb68-9k284" Apr 21 04:00:18.536404 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:18.536471 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0542ee55-ebd2-4c1f-888f-a044bc8cc7dc-service-ca\") pod \"console-cdfc4cb68-9k284\" (UID: \"0542ee55-ebd2-4c1f-888f-a044bc8cc7dc\") " pod="openshift-console/console-cdfc4cb68-9k284" Apr 21 04:00:18.539428 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:18.536526 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0542ee55-ebd2-4c1f-888f-a044bc8cc7dc-trusted-ca-bundle\") pod \"console-cdfc4cb68-9k284\" (UID: \"0542ee55-ebd2-4c1f-888f-a044bc8cc7dc\") " pod="openshift-console/console-cdfc4cb68-9k284" Apr 21 04:00:18.539428 ip-10-0-134-136 
kubenswrapper[2578]: I0421 04:00:18.536655 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0542ee55-ebd2-4c1f-888f-a044bc8cc7dc-console-serving-cert\") pod \"console-cdfc4cb68-9k284\" (UID: \"0542ee55-ebd2-4c1f-888f-a044bc8cc7dc\") " pod="openshift-console/console-cdfc4cb68-9k284" Apr 21 04:00:18.539428 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:18.536701 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4f9sp\" (UniqueName: \"kubernetes.io/projected/0542ee55-ebd2-4c1f-888f-a044bc8cc7dc-kube-api-access-4f9sp\") pod \"console-cdfc4cb68-9k284\" (UID: \"0542ee55-ebd2-4c1f-888f-a044bc8cc7dc\") " pod="openshift-console/console-cdfc4cb68-9k284" Apr 21 04:00:18.539428 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:18.538429 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0542ee55-ebd2-4c1f-888f-a044bc8cc7dc-trusted-ca-bundle\") pod \"console-cdfc4cb68-9k284\" (UID: \"0542ee55-ebd2-4c1f-888f-a044bc8cc7dc\") " pod="openshift-console/console-cdfc4cb68-9k284" Apr 21 04:00:18.539428 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:18.538957 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0542ee55-ebd2-4c1f-888f-a044bc8cc7dc-oauth-serving-cert\") pod \"console-cdfc4cb68-9k284\" (UID: \"0542ee55-ebd2-4c1f-888f-a044bc8cc7dc\") " pod="openshift-console/console-cdfc4cb68-9k284" Apr 21 04:00:18.540180 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:18.540004 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0542ee55-ebd2-4c1f-888f-a044bc8cc7dc-service-ca\") pod \"console-cdfc4cb68-9k284\" (UID: \"0542ee55-ebd2-4c1f-888f-a044bc8cc7dc\") " pod="openshift-console/console-cdfc4cb68-9k284" Apr 
21 04:00:18.540693 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:18.540652 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0542ee55-ebd2-4c1f-888f-a044bc8cc7dc-console-config\") pod \"console-cdfc4cb68-9k284\" (UID: \"0542ee55-ebd2-4c1f-888f-a044bc8cc7dc\") " pod="openshift-console/console-cdfc4cb68-9k284" Apr 21 04:00:18.548873 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:18.548837 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4f9sp\" (UniqueName: \"kubernetes.io/projected/0542ee55-ebd2-4c1f-888f-a044bc8cc7dc-kube-api-access-4f9sp\") pod \"console-cdfc4cb68-9k284\" (UID: \"0542ee55-ebd2-4c1f-888f-a044bc8cc7dc\") " pod="openshift-console/console-cdfc4cb68-9k284" Apr 21 04:00:18.559306 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:18.559236 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0542ee55-ebd2-4c1f-888f-a044bc8cc7dc-console-oauth-config\") pod \"console-cdfc4cb68-9k284\" (UID: \"0542ee55-ebd2-4c1f-888f-a044bc8cc7dc\") " pod="openshift-console/console-cdfc4cb68-9k284" Apr 21 04:00:18.559488 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:18.559464 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0542ee55-ebd2-4c1f-888f-a044bc8cc7dc-console-serving-cert\") pod \"console-cdfc4cb68-9k284\" (UID: \"0542ee55-ebd2-4c1f-888f-a044bc8cc7dc\") " pod="openshift-console/console-cdfc4cb68-9k284" Apr 21 04:00:18.594346 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:18.593890 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-cdfc4cb68-9k284" Apr 21 04:00:18.824829 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:18.824126 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-cdfc4cb68-9k284"] Apr 21 04:00:19.462098 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:19.462036 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-zmnts" event={"ID":"aa4733fb-75e2-4c77-bd5d-7ea904966689","Type":"ContainerStarted","Data":"466a8693b6f4ec8e37037cbcacdc2e0ef665e9c730af19bd5eee70525ec4f800"} Apr 21 04:00:19.462098 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:19.462078 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-zmnts" event={"ID":"aa4733fb-75e2-4c77-bd5d-7ea904966689","Type":"ContainerStarted","Data":"b3db86c47dc4a58c28103ab8a9cb39a4f0f580252209bdb0750a5a0e1ee9bee3"} Apr 21 04:00:19.483450 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:19.483389 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-zmnts" podStartSLOduration=10.633497469 podStartE2EDuration="11.483370545s" podCreationTimestamp="2026-04-21 04:00:08 +0000 UTC" firstStartedPulling="2026-04-21 04:00:16.917310088 +0000 UTC m=+160.586749140" lastFinishedPulling="2026-04-21 04:00:17.76718316 +0000 UTC m=+161.436622216" observedRunningTime="2026-04-21 04:00:19.481781495 +0000 UTC m=+163.151220571" watchObservedRunningTime="2026-04-21 04:00:19.483370545 +0000 UTC m=+163.152809623" Apr 21 04:00:20.171026 ip-10-0-134-136 kubenswrapper[2578]: W0421 04:00:20.170821 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0542ee55_ebd2_4c1f_888f_a044bc8cc7dc.slice/crio-1765514563bfd08b79df0dc5c745eaecd1f1c84cacb138c68c8ce828e332c2bf WatchSource:0}: Error finding container 
1765514563bfd08b79df0dc5c745eaecd1f1c84cacb138c68c8ce828e332c2bf: Status 404 returned error can't find the container with id 1765514563bfd08b79df0dc5c745eaecd1f1c84cacb138c68c8ce828e332c2bf Apr 21 04:00:20.466489 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:20.466402 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-cdfc4cb68-9k284" event={"ID":"0542ee55-ebd2-4c1f-888f-a044bc8cc7dc","Type":"ContainerStarted","Data":"1765514563bfd08b79df0dc5c745eaecd1f1c84cacb138c68c8ce828e332c2bf"} Apr 21 04:00:26.487070 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:26.487033 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-sjcwm" event={"ID":"e9350a7b-5fe0-4e30-8c05-5ee260472029","Type":"ContainerStarted","Data":"d0d8b4508b0a31df55eb6fbf197aa79d6372d7646c8578fbf21f6cfd64e19f32"} Apr 21 04:00:26.488655 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:26.488620 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7bb9c48849-mjnfz" event={"ID":"a2c56362-323c-41f4-a1ca-b564476fd1a1","Type":"ContainerStarted","Data":"56648fc0780cb23a2a24dae097ff906797d4b16df35990f5027c00e212010122"} Apr 21 04:00:26.490159 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:26.490128 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-fd55fd488-h8sm5" event={"ID":"972cb6a5-a2e1-4220-b72b-f7d3cf55ce7a","Type":"ContainerStarted","Data":"790cb20620daba9a2f0424efc29e7885e6b1fbeef1eec44e28231d47a5ece968"} Apr 21 04:00:26.491701 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:26.491674 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-r88vj" event={"ID":"976a9808-8386-44d6-964f-4f35e8b7bf8f","Type":"ContainerStarted","Data":"a2a8d039cae9f94aa4a410de6cf7eac171da12b76e2bc4daeb868241bc766f85"} Apr 21 04:00:26.491886 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:26.491868 2578 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-r88vj" Apr 21 04:00:26.493213 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:26.493183 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-cdfc4cb68-9k284" event={"ID":"0542ee55-ebd2-4c1f-888f-a044bc8cc7dc","Type":"ContainerStarted","Data":"cf47a063c8fa3bb3aec3c346b79b1e2f4185d05ed41135c23b1ede8b00e76f2f"} Apr 21 04:00:26.494840 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:26.494816 2578 generic.go:358] "Generic (PLEG): container finished" podID="d40cad4f-6109-4381-b7f5-2a037fc43ff8" containerID="5981232c31672bf11741f5962ea983b74763ceb5329bfc3128c7194c6df2a378" exitCode=0 Apr 21 04:00:26.494958 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:26.494903 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d40cad4f-6109-4381-b7f5-2a037fc43ff8","Type":"ContainerDied","Data":"5981232c31672bf11741f5962ea983b74763ceb5329bfc3128c7194c6df2a378"} Apr 21 04:00:26.498024 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:26.497611 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6c69947b7-mcmmj" event={"ID":"b59fc3b6-a6a1-4455-b9b3-ae8335e2a143","Type":"ContainerStarted","Data":"de96bc95eacd4cdebf92c666589cdf793dfae3c56f90c07f669de12a6d40a6e1"} Apr 21 04:00:26.498024 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:26.497641 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6c69947b7-mcmmj" event={"ID":"b59fc3b6-a6a1-4455-b9b3-ae8335e2a143","Type":"ContainerStarted","Data":"605e6147de6c451368d774ed4d2370cc54eeeaf26906de1a9f14838aa761b4d7"} Apr 21 04:00:26.498024 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:26.497656 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6c69947b7-mcmmj" 
event={"ID":"b59fc3b6-a6a1-4455-b9b3-ae8335e2a143","Type":"ContainerStarted","Data":"e529bd5fc421b90ea24e8128b074f1cb9b86b3824f42f290ce34c1cb5fe0fa90"} Apr 21 04:00:26.498492 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:26.498473 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-r88vj" Apr 21 04:00:26.499779 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:26.499758 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-lk9pj" event={"ID":"5b9715dc-7dd5-46ea-961a-2f02107a7655","Type":"ContainerStarted","Data":"8b12c2d296ede7903772c4c878714b337c420f7adf1e8722944143de7bc7170d"} Apr 21 04:00:26.499865 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:26.499785 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-lk9pj" event={"ID":"5b9715dc-7dd5-46ea-961a-2f02107a7655","Type":"ContainerStarted","Data":"407c7ed172fdc1caec1c7e057098b30d39066c0c86e2659b2d940234a9e092d7"} Apr 21 04:00:26.499939 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:26.499924 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-lk9pj" Apr 21 04:00:26.503107 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:26.503059 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-sjcwm" podStartSLOduration=130.004118425 podStartE2EDuration="2m18.50304468s" podCreationTimestamp="2026-04-21 03:58:08 +0000 UTC" firstStartedPulling="2026-04-21 04:00:17.094909267 +0000 UTC m=+160.764348337" lastFinishedPulling="2026-04-21 04:00:25.593835534 +0000 UTC m=+169.263274592" observedRunningTime="2026-04-21 04:00:26.502059724 +0000 UTC m=+170.171498820" watchObservedRunningTime="2026-04-21 04:00:26.50304468 +0000 UTC m=+170.172483768" Apr 21 04:00:26.519030 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:26.518912 2578 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-console/console-cdfc4cb68-9k284" podStartSLOduration=2.8079207889999998 podStartE2EDuration="8.518893683s" podCreationTimestamp="2026-04-21 04:00:18 +0000 UTC" firstStartedPulling="2026-04-21 04:00:20.172728633 +0000 UTC m=+163.842167695" lastFinishedPulling="2026-04-21 04:00:25.883701533 +0000 UTC m=+169.553140589" observedRunningTime="2026-04-21 04:00:26.518308549 +0000 UTC m=+170.187747626" watchObservedRunningTime="2026-04-21 04:00:26.518893683 +0000 UTC m=+170.188332758" Apr 21 04:00:26.536385 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:26.536334 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7bb9c48849-mjnfz" podStartSLOduration=8.284684617 podStartE2EDuration="16.536318588s" podCreationTimestamp="2026-04-21 04:00:10 +0000 UTC" firstStartedPulling="2026-04-21 04:00:17.349596294 +0000 UTC m=+161.019035347" lastFinishedPulling="2026-04-21 04:00:25.601230253 +0000 UTC m=+169.270669318" observedRunningTime="2026-04-21 04:00:26.535766961 +0000 UTC m=+170.205206037" watchObservedRunningTime="2026-04-21 04:00:26.536318588 +0000 UTC m=+170.205757663" Apr 21 04:00:26.551270 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:26.551212 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-fd55fd488-h8sm5" podStartSLOduration=6.304840041 podStartE2EDuration="14.551195645s" podCreationTimestamp="2026-04-21 04:00:12 +0000 UTC" firstStartedPulling="2026-04-21 04:00:17.347501522 +0000 UTC m=+161.016940574" lastFinishedPulling="2026-04-21 04:00:25.593857114 +0000 UTC m=+169.263296178" observedRunningTime="2026-04-21 04:00:26.55078448 +0000 UTC m=+170.220223559" watchObservedRunningTime="2026-04-21 04:00:26.551195645 +0000 UTC m=+170.220634721" Apr 21 04:00:26.590138 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:26.590088 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-dns/dns-default-lk9pj" podStartSLOduration=130.1179212 podStartE2EDuration="2m18.590072193s" podCreationTimestamp="2026-04-21 03:58:08 +0000 UTC" firstStartedPulling="2026-04-21 04:00:17.121023924 +0000 UTC m=+160.790462978" lastFinishedPulling="2026-04-21 04:00:25.593174911 +0000 UTC m=+169.262613971" observedRunningTime="2026-04-21 04:00:26.589654274 +0000 UTC m=+170.259093352" watchObservedRunningTime="2026-04-21 04:00:26.590072193 +0000 UTC m=+170.259511268" Apr 21 04:00:26.605169 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:26.605110 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-r88vj" podStartSLOduration=6.356388599 podStartE2EDuration="14.605090733s" podCreationTimestamp="2026-04-21 04:00:12 +0000 UTC" firstStartedPulling="2026-04-21 04:00:17.345008865 +0000 UTC m=+161.014447930" lastFinishedPulling="2026-04-21 04:00:25.593711006 +0000 UTC m=+169.263150064" observedRunningTime="2026-04-21 04:00:26.603801086 +0000 UTC m=+170.273240163" watchObservedRunningTime="2026-04-21 04:00:26.605090733 +0000 UTC m=+170.274529809" Apr 21 04:00:26.928500 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:26.928458 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-2lrq9" Apr 21 04:00:28.508298 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:28.508251 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6c69947b7-mcmmj" event={"ID":"b59fc3b6-a6a1-4455-b9b3-ae8335e2a143","Type":"ContainerStarted","Data":"0e3c901b435904ea8f6dac4cb5a4dba572eb7842794c27510df9e33118a834d6"} Apr 21 04:00:28.594896 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:28.594864 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-cdfc4cb68-9k284" Apr 21 04:00:28.594896 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:28.594895 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-cdfc4cb68-9k284" Apr 21 04:00:28.599672 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:28.599644 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-cdfc4cb68-9k284" Apr 21 04:00:29.516664 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:29.516622 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d40cad4f-6109-4381-b7f5-2a037fc43ff8","Type":"ContainerStarted","Data":"21cecbf4925efba46baacc22b998ebd0018539e509c503aaf1692ed7c7ded8cc"} Apr 21 04:00:29.516664 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:29.516668 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d40cad4f-6109-4381-b7f5-2a037fc43ff8","Type":"ContainerStarted","Data":"41b84b76ee99b1fa4ac51c2ac9f66057d4eede5e527f17a61f5c261b8e2266d1"} Apr 21 04:00:29.517130 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:29.516682 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"d40cad4f-6109-4381-b7f5-2a037fc43ff8","Type":"ContainerStarted","Data":"d9fa315a3c7d1735c87ea493f877bb6ebfd07ce084fa8d82e82c99800f888023"} Apr 21 04:00:29.517130 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:29.516695 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d40cad4f-6109-4381-b7f5-2a037fc43ff8","Type":"ContainerStarted","Data":"6d8fa447a4b41fc877916f05f14e85b11d3767814f12e9c224e0936c7c4e0bc2"} Apr 21 04:00:29.517130 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:29.516708 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d40cad4f-6109-4381-b7f5-2a037fc43ff8","Type":"ContainerStarted","Data":"2bf2c42ce6087b7e9bb0b6e8cd92fec4d2305208a5e03ed8c18a6290f7ea27bc"} Apr 21 04:00:29.517130 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:29.516720 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d40cad4f-6109-4381-b7f5-2a037fc43ff8","Type":"ContainerStarted","Data":"4537f4f441d81ca6d8969dfe7ce79981942a3f0378721423f0891bd28bdd4c03"} Apr 21 04:00:29.519164 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:29.519136 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6c69947b7-mcmmj" event={"ID":"b59fc3b6-a6a1-4455-b9b3-ae8335e2a143","Type":"ContainerStarted","Data":"8785a24da89ad34660383f7d0f3df73e6d357988ede1c76ed1ace211f8d99392"} Apr 21 04:00:29.519319 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:29.519170 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6c69947b7-mcmmj" event={"ID":"b59fc3b6-a6a1-4455-b9b3-ae8335e2a143","Type":"ContainerStarted","Data":"2cbf8e6fb59ab3056c3383dcc73ff725a7eb03d85a1e65c06d3b5e38fdae7621"} Apr 21 04:00:29.523335 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:29.523313 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-console/console-cdfc4cb68-9k284" Apr 21 04:00:29.541616 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:29.541526 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=9.246997156 podStartE2EDuration="20.54149056s" podCreationTimestamp="2026-04-21 04:00:09 +0000 UTC" firstStartedPulling="2026-04-21 04:00:17.397672977 +0000 UTC m=+161.067112036" lastFinishedPulling="2026-04-21 04:00:28.692166383 +0000 UTC m=+172.361605440" observedRunningTime="2026-04-21 04:00:29.540266319 +0000 UTC m=+173.209705419" watchObservedRunningTime="2026-04-21 04:00:29.54149056 +0000 UTC m=+173.210929634" Apr 21 04:00:29.564075 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:29.564017 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-6c69947b7-mcmmj" podStartSLOduration=7.651477367 podStartE2EDuration="18.563996691s" podCreationTimestamp="2026-04-21 04:00:11 +0000 UTC" firstStartedPulling="2026-04-21 04:00:17.397752571 +0000 UTC m=+161.067191638" lastFinishedPulling="2026-04-21 04:00:28.310271906 +0000 UTC m=+171.979710962" observedRunningTime="2026-04-21 04:00:29.56239831 +0000 UTC m=+173.231837386" watchObservedRunningTime="2026-04-21 04:00:29.563996691 +0000 UTC m=+173.233435769" Apr 21 04:00:29.606234 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:29.606151 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7bb9c48849-mjnfz"] Apr 21 04:00:30.523661 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:30.523622 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-6c69947b7-mcmmj" Apr 21 04:00:30.754817 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:30.754776 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-7bb9c48849-mjnfz" Apr 21 04:00:31.532554 
ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:31.532528 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-6c69947b7-mcmmj" Apr 21 04:00:32.753042 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:32.753002 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-fd55fd488-h8sm5" Apr 21 04:00:32.753042 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:32.753045 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-fd55fd488-h8sm5" Apr 21 04:00:36.505366 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:36.505337 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-lk9pj" Apr 21 04:00:50.589190 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:50.589152 2578 generic.go:358] "Generic (PLEG): container finished" podID="047ff53e-3808-49b6-ad81-7bd15d251053" containerID="9cceb391afbd9e1e3d468d358d3d76edac154e43cef76960b881ae2dec7afb4e" exitCode=0 Apr 21 04:00:50.589587 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:50.589223 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-pfkbf" event={"ID":"047ff53e-3808-49b6-ad81-7bd15d251053","Type":"ContainerDied","Data":"9cceb391afbd9e1e3d468d358d3d76edac154e43cef76960b881ae2dec7afb4e"} Apr 21 04:00:50.589643 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:50.589621 2578 scope.go:117] "RemoveContainer" containerID="9cceb391afbd9e1e3d468d358d3d76edac154e43cef76960b881ae2dec7afb4e" Apr 21 04:00:51.594069 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:51.594035 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-pfkbf" 
event={"ID":"047ff53e-3808-49b6-ad81-7bd15d251053","Type":"ContainerStarted","Data":"287b5dba04cdcfa64b69705444fe7999e28cc6be8265c5c2309b2537b02a01ae"} Apr 21 04:00:52.759036 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:52.759006 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-fd55fd488-h8sm5" Apr 21 04:00:52.762860 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:52.762837 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-fd55fd488-h8sm5" Apr 21 04:00:54.625558 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:54.625500 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-7bb9c48849-mjnfz" podUID="a2c56362-323c-41f4-a1ca-b564476fd1a1" containerName="console" containerID="cri-o://56648fc0780cb23a2a24dae097ff906797d4b16df35990f5027c00e212010122" gracePeriod=15 Apr 21 04:00:54.877386 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:54.877330 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7bb9c48849-mjnfz_a2c56362-323c-41f4-a1ca-b564476fd1a1/console/0.log" Apr 21 04:00:54.877386 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:54.877386 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7bb9c48849-mjnfz" Apr 21 04:00:54.979656 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:54.979618 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-td6mr\" (UniqueName: \"kubernetes.io/projected/a2c56362-323c-41f4-a1ca-b564476fd1a1-kube-api-access-td6mr\") pod \"a2c56362-323c-41f4-a1ca-b564476fd1a1\" (UID: \"a2c56362-323c-41f4-a1ca-b564476fd1a1\") " Apr 21 04:00:54.979656 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:54.979665 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a2c56362-323c-41f4-a1ca-b564476fd1a1-console-serving-cert\") pod \"a2c56362-323c-41f4-a1ca-b564476fd1a1\" (UID: \"a2c56362-323c-41f4-a1ca-b564476fd1a1\") " Apr 21 04:00:54.979858 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:54.979699 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a2c56362-323c-41f4-a1ca-b564476fd1a1-trusted-ca-bundle\") pod \"a2c56362-323c-41f4-a1ca-b564476fd1a1\" (UID: \"a2c56362-323c-41f4-a1ca-b564476fd1a1\") " Apr 21 04:00:54.979858 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:54.979746 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a2c56362-323c-41f4-a1ca-b564476fd1a1-console-oauth-config\") pod \"a2c56362-323c-41f4-a1ca-b564476fd1a1\" (UID: \"a2c56362-323c-41f4-a1ca-b564476fd1a1\") " Apr 21 04:00:54.979858 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:54.979762 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a2c56362-323c-41f4-a1ca-b564476fd1a1-oauth-serving-cert\") pod \"a2c56362-323c-41f4-a1ca-b564476fd1a1\" (UID: \"a2c56362-323c-41f4-a1ca-b564476fd1a1\") " Apr 21 
04:00:54.979858 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:54.979790 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a2c56362-323c-41f4-a1ca-b564476fd1a1-service-ca\") pod \"a2c56362-323c-41f4-a1ca-b564476fd1a1\" (UID: \"a2c56362-323c-41f4-a1ca-b564476fd1a1\") " Apr 21 04:00:54.979858 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:54.979841 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a2c56362-323c-41f4-a1ca-b564476fd1a1-console-config\") pod \"a2c56362-323c-41f4-a1ca-b564476fd1a1\" (UID: \"a2c56362-323c-41f4-a1ca-b564476fd1a1\") " Apr 21 04:00:54.980218 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:54.980175 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a2c56362-323c-41f4-a1ca-b564476fd1a1-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "a2c56362-323c-41f4-a1ca-b564476fd1a1" (UID: "a2c56362-323c-41f4-a1ca-b564476fd1a1"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 04:00:54.980389 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:54.980307 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a2c56362-323c-41f4-a1ca-b564476fd1a1-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "a2c56362-323c-41f4-a1ca-b564476fd1a1" (UID: "a2c56362-323c-41f4-a1ca-b564476fd1a1"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 04:00:54.980389 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:54.980232 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a2c56362-323c-41f4-a1ca-b564476fd1a1-service-ca" (OuterVolumeSpecName: "service-ca") pod "a2c56362-323c-41f4-a1ca-b564476fd1a1" (UID: "a2c56362-323c-41f4-a1ca-b564476fd1a1"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 04:00:54.980389 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:54.980383 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a2c56362-323c-41f4-a1ca-b564476fd1a1-console-config" (OuterVolumeSpecName: "console-config") pod "a2c56362-323c-41f4-a1ca-b564476fd1a1" (UID: "a2c56362-323c-41f4-a1ca-b564476fd1a1"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 04:00:54.982093 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:54.982070 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2c56362-323c-41f4-a1ca-b564476fd1a1-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "a2c56362-323c-41f4-a1ca-b564476fd1a1" (UID: "a2c56362-323c-41f4-a1ca-b564476fd1a1"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 04:00:54.982175 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:54.982100 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2c56362-323c-41f4-a1ca-b564476fd1a1-kube-api-access-td6mr" (OuterVolumeSpecName: "kube-api-access-td6mr") pod "a2c56362-323c-41f4-a1ca-b564476fd1a1" (UID: "a2c56362-323c-41f4-a1ca-b564476fd1a1"). InnerVolumeSpecName "kube-api-access-td6mr". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 04:00:54.982229 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:54.982177 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2c56362-323c-41f4-a1ca-b564476fd1a1-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "a2c56362-323c-41f4-a1ca-b564476fd1a1" (UID: "a2c56362-323c-41f4-a1ca-b564476fd1a1"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 04:00:55.080680 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:55.080642 2578 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a2c56362-323c-41f4-a1ca-b564476fd1a1-service-ca\") on node \"ip-10-0-134-136.ec2.internal\" DevicePath \"\"" Apr 21 04:00:55.080680 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:55.080671 2578 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a2c56362-323c-41f4-a1ca-b564476fd1a1-console-config\") on node \"ip-10-0-134-136.ec2.internal\" DevicePath \"\"" Apr 21 04:00:55.080680 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:55.080681 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-td6mr\" (UniqueName: \"kubernetes.io/projected/a2c56362-323c-41f4-a1ca-b564476fd1a1-kube-api-access-td6mr\") on node \"ip-10-0-134-136.ec2.internal\" DevicePath \"\"" Apr 21 04:00:55.080680 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:55.080692 2578 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a2c56362-323c-41f4-a1ca-b564476fd1a1-console-serving-cert\") on node \"ip-10-0-134-136.ec2.internal\" DevicePath \"\"" Apr 21 04:00:55.080932 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:55.080700 2578 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/a2c56362-323c-41f4-a1ca-b564476fd1a1-trusted-ca-bundle\") on node \"ip-10-0-134-136.ec2.internal\" DevicePath \"\"" Apr 21 04:00:55.080932 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:55.080709 2578 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a2c56362-323c-41f4-a1ca-b564476fd1a1-console-oauth-config\") on node \"ip-10-0-134-136.ec2.internal\" DevicePath \"\"" Apr 21 04:00:55.080932 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:55.080718 2578 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a2c56362-323c-41f4-a1ca-b564476fd1a1-oauth-serving-cert\") on node \"ip-10-0-134-136.ec2.internal\" DevicePath \"\"" Apr 21 04:00:55.605776 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:55.605746 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7bb9c48849-mjnfz_a2c56362-323c-41f4-a1ca-b564476fd1a1/console/0.log" Apr 21 04:00:55.605977 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:55.605786 2578 generic.go:358] "Generic (PLEG): container finished" podID="a2c56362-323c-41f4-a1ca-b564476fd1a1" containerID="56648fc0780cb23a2a24dae097ff906797d4b16df35990f5027c00e212010122" exitCode=2 Apr 21 04:00:55.605977 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:55.605819 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7bb9c48849-mjnfz" event={"ID":"a2c56362-323c-41f4-a1ca-b564476fd1a1","Type":"ContainerDied","Data":"56648fc0780cb23a2a24dae097ff906797d4b16df35990f5027c00e212010122"} Apr 21 04:00:55.605977 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:55.605859 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7bb9c48849-mjnfz" event={"ID":"a2c56362-323c-41f4-a1ca-b564476fd1a1","Type":"ContainerDied","Data":"1512ddb5dc9a701a9714a791675a6267f47d50c9bb7a4f884adf2abc49309f71"} Apr 21 04:00:55.605977 
ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:55.605861 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7bb9c48849-mjnfz" Apr 21 04:00:55.605977 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:55.605876 2578 scope.go:117] "RemoveContainer" containerID="56648fc0780cb23a2a24dae097ff906797d4b16df35990f5027c00e212010122" Apr 21 04:00:55.619975 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:55.619955 2578 scope.go:117] "RemoveContainer" containerID="56648fc0780cb23a2a24dae097ff906797d4b16df35990f5027c00e212010122" Apr 21 04:00:55.620270 ip-10-0-134-136 kubenswrapper[2578]: E0421 04:00:55.620249 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"56648fc0780cb23a2a24dae097ff906797d4b16df35990f5027c00e212010122\": container with ID starting with 56648fc0780cb23a2a24dae097ff906797d4b16df35990f5027c00e212010122 not found: ID does not exist" containerID="56648fc0780cb23a2a24dae097ff906797d4b16df35990f5027c00e212010122" Apr 21 04:00:55.620373 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:55.620322 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56648fc0780cb23a2a24dae097ff906797d4b16df35990f5027c00e212010122"} err="failed to get container status \"56648fc0780cb23a2a24dae097ff906797d4b16df35990f5027c00e212010122\": rpc error: code = NotFound desc = could not find container \"56648fc0780cb23a2a24dae097ff906797d4b16df35990f5027c00e212010122\": container with ID starting with 56648fc0780cb23a2a24dae097ff906797d4b16df35990f5027c00e212010122 not found: ID does not exist" Apr 21 04:00:55.626981 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:55.626956 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7bb9c48849-mjnfz"] Apr 21 04:00:55.630434 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:55.630412 2578 kubelet.go:2547] "SyncLoop REMOVE" 
source="api" pods=["openshift-console/console-7bb9c48849-mjnfz"]
Apr 21 04:00:56.928398 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:00:56.928359 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2c56362-323c-41f4-a1ca-b564476fd1a1" path="/var/lib/kubelet/pods/a2c56362-323c-41f4-a1ca-b564476fd1a1/volumes"
Apr 21 04:01:28.270077 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:28.270041 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 21 04:01:28.270585 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:28.270497 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="d40cad4f-6109-4381-b7f5-2a037fc43ff8" containerName="alertmanager" containerID="cri-o://4537f4f441d81ca6d8969dfe7ce79981942a3f0378721423f0891bd28bdd4c03" gracePeriod=120
Apr 21 04:01:28.270646 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:28.270565 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="d40cad4f-6109-4381-b7f5-2a037fc43ff8" containerName="kube-rbac-proxy-metric" containerID="cri-o://41b84b76ee99b1fa4ac51c2ac9f66057d4eede5e527f17a61f5c261b8e2266d1" gracePeriod=120
Apr 21 04:01:28.270646 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:28.270600 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="d40cad4f-6109-4381-b7f5-2a037fc43ff8" containerName="kube-rbac-proxy-web" containerID="cri-o://6d8fa447a4b41fc877916f05f14e85b11d3767814f12e9c224e0936c7c4e0bc2" gracePeriod=120
Apr 21 04:01:28.270646 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:28.270616 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="d40cad4f-6109-4381-b7f5-2a037fc43ff8" containerName="config-reloader" containerID="cri-o://2bf2c42ce6087b7e9bb0b6e8cd92fec4d2305208a5e03ed8c18a6290f7ea27bc" gracePeriod=120
Apr 21 04:01:28.270798 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:28.270633 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="d40cad4f-6109-4381-b7f5-2a037fc43ff8" containerName="kube-rbac-proxy" containerID="cri-o://d9fa315a3c7d1735c87ea493f877bb6ebfd07ce084fa8d82e82c99800f888023" gracePeriod=120
Apr 21 04:01:28.270798 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:28.270676 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="d40cad4f-6109-4381-b7f5-2a037fc43ff8" containerName="prom-label-proxy" containerID="cri-o://21cecbf4925efba46baacc22b998ebd0018539e509c503aaf1692ed7c7ded8cc" gracePeriod=120
Apr 21 04:01:28.704487 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:28.704453 2578 generic.go:358] "Generic (PLEG): container finished" podID="d40cad4f-6109-4381-b7f5-2a037fc43ff8" containerID="21cecbf4925efba46baacc22b998ebd0018539e509c503aaf1692ed7c7ded8cc" exitCode=0
Apr 21 04:01:28.704487 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:28.704480 2578 generic.go:358] "Generic (PLEG): container finished" podID="d40cad4f-6109-4381-b7f5-2a037fc43ff8" containerID="41b84b76ee99b1fa4ac51c2ac9f66057d4eede5e527f17a61f5c261b8e2266d1" exitCode=0
Apr 21 04:01:28.704487 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:28.704488 2578 generic.go:358] "Generic (PLEG): container finished" podID="d40cad4f-6109-4381-b7f5-2a037fc43ff8" containerID="d9fa315a3c7d1735c87ea493f877bb6ebfd07ce084fa8d82e82c99800f888023" exitCode=0
Apr 21 04:01:28.704487 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:28.704495 2578 generic.go:358] "Generic (PLEG): container finished" podID="d40cad4f-6109-4381-b7f5-2a037fc43ff8" containerID="2bf2c42ce6087b7e9bb0b6e8cd92fec4d2305208a5e03ed8c18a6290f7ea27bc" exitCode=0
Apr 21 04:01:28.704753 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:28.704502 2578 generic.go:358] "Generic (PLEG): container finished" podID="d40cad4f-6109-4381-b7f5-2a037fc43ff8" containerID="4537f4f441d81ca6d8969dfe7ce79981942a3f0378721423f0891bd28bdd4c03" exitCode=0
Apr 21 04:01:28.704753 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:28.704527 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d40cad4f-6109-4381-b7f5-2a037fc43ff8","Type":"ContainerDied","Data":"21cecbf4925efba46baacc22b998ebd0018539e509c503aaf1692ed7c7ded8cc"}
Apr 21 04:01:28.704753 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:28.704559 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d40cad4f-6109-4381-b7f5-2a037fc43ff8","Type":"ContainerDied","Data":"41b84b76ee99b1fa4ac51c2ac9f66057d4eede5e527f17a61f5c261b8e2266d1"}
Apr 21 04:01:28.704753 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:28.704569 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d40cad4f-6109-4381-b7f5-2a037fc43ff8","Type":"ContainerDied","Data":"d9fa315a3c7d1735c87ea493f877bb6ebfd07ce084fa8d82e82c99800f888023"}
Apr 21 04:01:28.704753 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:28.704581 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d40cad4f-6109-4381-b7f5-2a037fc43ff8","Type":"ContainerDied","Data":"2bf2c42ce6087b7e9bb0b6e8cd92fec4d2305208a5e03ed8c18a6290f7ea27bc"}
Apr 21 04:01:28.704753 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:28.704589 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d40cad4f-6109-4381-b7f5-2a037fc43ff8","Type":"ContainerDied","Data":"4537f4f441d81ca6d8969dfe7ce79981942a3f0378721423f0891bd28bdd4c03"}
Apr 21 04:01:29.508969 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:29.508944 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 21 04:01:29.577845 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:29.577807 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ntxnm\" (UniqueName: \"kubernetes.io/projected/d40cad4f-6109-4381-b7f5-2a037fc43ff8-kube-api-access-ntxnm\") pod \"d40cad4f-6109-4381-b7f5-2a037fc43ff8\" (UID: \"d40cad4f-6109-4381-b7f5-2a037fc43ff8\") "
Apr 21 04:01:29.577845 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:29.577842 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d40cad4f-6109-4381-b7f5-2a037fc43ff8-metrics-client-ca\") pod \"d40cad4f-6109-4381-b7f5-2a037fc43ff8\" (UID: \"d40cad4f-6109-4381-b7f5-2a037fc43ff8\") "
Apr 21 04:01:29.578078 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:29.577862 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d40cad4f-6109-4381-b7f5-2a037fc43ff8-secret-alertmanager-kube-rbac-proxy-web\") pod \"d40cad4f-6109-4381-b7f5-2a037fc43ff8\" (UID: \"d40cad4f-6109-4381-b7f5-2a037fc43ff8\") "
Apr 21 04:01:29.578078 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:29.577888 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/d40cad4f-6109-4381-b7f5-2a037fc43ff8-secret-alertmanager-main-tls\") pod \"d40cad4f-6109-4381-b7f5-2a037fc43ff8\" (UID: \"d40cad4f-6109-4381-b7f5-2a037fc43ff8\") "
Apr 21 04:01:29.578078 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:29.577912 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/d40cad4f-6109-4381-b7f5-2a037fc43ff8-alertmanager-main-db\") pod \"d40cad4f-6109-4381-b7f5-2a037fc43ff8\" (UID: \"d40cad4f-6109-4381-b7f5-2a037fc43ff8\") "
Apr 21 04:01:29.578078 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:29.577936 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d40cad4f-6109-4381-b7f5-2a037fc43ff8-secret-alertmanager-kube-rbac-proxy\") pod \"d40cad4f-6109-4381-b7f5-2a037fc43ff8\" (UID: \"d40cad4f-6109-4381-b7f5-2a037fc43ff8\") "
Apr 21 04:01:29.578078 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:29.577978 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d40cad4f-6109-4381-b7f5-2a037fc43ff8-web-config\") pod \"d40cad4f-6109-4381-b7f5-2a037fc43ff8\" (UID: \"d40cad4f-6109-4381-b7f5-2a037fc43ff8\") "
Apr 21 04:01:29.578078 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:29.578004 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/d40cad4f-6109-4381-b7f5-2a037fc43ff8-cluster-tls-config\") pod \"d40cad4f-6109-4381-b7f5-2a037fc43ff8\" (UID: \"d40cad4f-6109-4381-b7f5-2a037fc43ff8\") "
Apr 21 04:01:29.578078 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:29.578045 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d40cad4f-6109-4381-b7f5-2a037fc43ff8-tls-assets\") pod \"d40cad4f-6109-4381-b7f5-2a037fc43ff8\" (UID: \"d40cad4f-6109-4381-b7f5-2a037fc43ff8\") "
Apr 21 04:01:29.578078 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:29.578073 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d40cad4f-6109-4381-b7f5-2a037fc43ff8-config-out\") pod \"d40cad4f-6109-4381-b7f5-2a037fc43ff8\" (UID: \"d40cad4f-6109-4381-b7f5-2a037fc43ff8\") "
Apr 21 04:01:29.578497 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:29.578099 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/d40cad4f-6109-4381-b7f5-2a037fc43ff8-config-volume\") pod \"d40cad4f-6109-4381-b7f5-2a037fc43ff8\" (UID: \"d40cad4f-6109-4381-b7f5-2a037fc43ff8\") "
Apr 21 04:01:29.578497 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:29.578125 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/d40cad4f-6109-4381-b7f5-2a037fc43ff8-secret-alertmanager-kube-rbac-proxy-metric\") pod \"d40cad4f-6109-4381-b7f5-2a037fc43ff8\" (UID: \"d40cad4f-6109-4381-b7f5-2a037fc43ff8\") "
Apr 21 04:01:29.578497 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:29.578151 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d40cad4f-6109-4381-b7f5-2a037fc43ff8-alertmanager-trusted-ca-bundle\") pod \"d40cad4f-6109-4381-b7f5-2a037fc43ff8\" (UID: \"d40cad4f-6109-4381-b7f5-2a037fc43ff8\") "
Apr 21 04:01:29.578497 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:29.578252 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d40cad4f-6109-4381-b7f5-2a037fc43ff8-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "d40cad4f-6109-4381-b7f5-2a037fc43ff8" (UID: "d40cad4f-6109-4381-b7f5-2a037fc43ff8"). InnerVolumeSpecName "alertmanager-main-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 21 04:01:29.578497 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:29.578267 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d40cad4f-6109-4381-b7f5-2a037fc43ff8-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "d40cad4f-6109-4381-b7f5-2a037fc43ff8" (UID: "d40cad4f-6109-4381-b7f5-2a037fc43ff8"). InnerVolumeSpecName "metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 21 04:01:29.578749 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:29.578650 2578 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d40cad4f-6109-4381-b7f5-2a037fc43ff8-metrics-client-ca\") on node \"ip-10-0-134-136.ec2.internal\" DevicePath \"\""
Apr 21 04:01:29.578749 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:29.578676 2578 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/d40cad4f-6109-4381-b7f5-2a037fc43ff8-alertmanager-main-db\") on node \"ip-10-0-134-136.ec2.internal\" DevicePath \"\""
Apr 21 04:01:29.578852 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:29.578764 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d40cad4f-6109-4381-b7f5-2a037fc43ff8-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "d40cad4f-6109-4381-b7f5-2a037fc43ff8" (UID: "d40cad4f-6109-4381-b7f5-2a037fc43ff8"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 21 04:01:29.581567 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:29.581424 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d40cad4f-6109-4381-b7f5-2a037fc43ff8-kube-api-access-ntxnm" (OuterVolumeSpecName: "kube-api-access-ntxnm") pod "d40cad4f-6109-4381-b7f5-2a037fc43ff8" (UID: "d40cad4f-6109-4381-b7f5-2a037fc43ff8"). InnerVolumeSpecName "kube-api-access-ntxnm". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 04:01:29.581567 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:29.581505 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d40cad4f-6109-4381-b7f5-2a037fc43ff8-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "d40cad4f-6109-4381-b7f5-2a037fc43ff8" (UID: "d40cad4f-6109-4381-b7f5-2a037fc43ff8"). InnerVolumeSpecName "secret-alertmanager-main-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 21 04:01:29.581866 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:29.581820 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d40cad4f-6109-4381-b7f5-2a037fc43ff8-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "d40cad4f-6109-4381-b7f5-2a037fc43ff8" (UID: "d40cad4f-6109-4381-b7f5-2a037fc43ff8"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 21 04:01:29.582000 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:29.581910 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d40cad4f-6109-4381-b7f5-2a037fc43ff8-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "d40cad4f-6109-4381-b7f5-2a037fc43ff8" (UID: "d40cad4f-6109-4381-b7f5-2a037fc43ff8"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 21 04:01:29.582101 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:29.582070 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d40cad4f-6109-4381-b7f5-2a037fc43ff8-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "d40cad4f-6109-4381-b7f5-2a037fc43ff8" (UID: "d40cad4f-6109-4381-b7f5-2a037fc43ff8"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 21 04:01:29.582421 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:29.582400 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d40cad4f-6109-4381-b7f5-2a037fc43ff8-config-volume" (OuterVolumeSpecName: "config-volume") pod "d40cad4f-6109-4381-b7f5-2a037fc43ff8" (UID: "d40cad4f-6109-4381-b7f5-2a037fc43ff8"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 21 04:01:29.582633 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:29.582611 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d40cad4f-6109-4381-b7f5-2a037fc43ff8-config-out" (OuterVolumeSpecName: "config-out") pod "d40cad4f-6109-4381-b7f5-2a037fc43ff8" (UID: "d40cad4f-6109-4381-b7f5-2a037fc43ff8"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 21 04:01:29.582794 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:29.582771 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d40cad4f-6109-4381-b7f5-2a037fc43ff8-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "d40cad4f-6109-4381-b7f5-2a037fc43ff8" (UID: "d40cad4f-6109-4381-b7f5-2a037fc43ff8"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 04:01:29.586515 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:29.586491 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d40cad4f-6109-4381-b7f5-2a037fc43ff8-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "d40cad4f-6109-4381-b7f5-2a037fc43ff8" (UID: "d40cad4f-6109-4381-b7f5-2a037fc43ff8"). InnerVolumeSpecName "cluster-tls-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 21 04:01:29.592847 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:29.592824 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d40cad4f-6109-4381-b7f5-2a037fc43ff8-web-config" (OuterVolumeSpecName: "web-config") pod "d40cad4f-6109-4381-b7f5-2a037fc43ff8" (UID: "d40cad4f-6109-4381-b7f5-2a037fc43ff8"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 21 04:01:29.679351 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:29.679240 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ntxnm\" (UniqueName: \"kubernetes.io/projected/d40cad4f-6109-4381-b7f5-2a037fc43ff8-kube-api-access-ntxnm\") on node \"ip-10-0-134-136.ec2.internal\" DevicePath \"\""
Apr 21 04:01:29.679351 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:29.679273 2578 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d40cad4f-6109-4381-b7f5-2a037fc43ff8-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-134-136.ec2.internal\" DevicePath \"\""
Apr 21 04:01:29.679351 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:29.679307 2578 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/d40cad4f-6109-4381-b7f5-2a037fc43ff8-secret-alertmanager-main-tls\") on node \"ip-10-0-134-136.ec2.internal\" DevicePath \"\""
Apr 21 04:01:29.679351 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:29.679316 2578 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d40cad4f-6109-4381-b7f5-2a037fc43ff8-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-134-136.ec2.internal\" DevicePath \"\""
Apr 21 04:01:29.679351 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:29.679328 2578 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d40cad4f-6109-4381-b7f5-2a037fc43ff8-web-config\") on node \"ip-10-0-134-136.ec2.internal\" DevicePath \"\""
Apr 21 04:01:29.679351 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:29.679338 2578 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/d40cad4f-6109-4381-b7f5-2a037fc43ff8-cluster-tls-config\") on node \"ip-10-0-134-136.ec2.internal\" DevicePath \"\""
Apr 21 04:01:29.679351 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:29.679347 2578 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d40cad4f-6109-4381-b7f5-2a037fc43ff8-tls-assets\") on node \"ip-10-0-134-136.ec2.internal\" DevicePath \"\""
Apr 21 04:01:29.679351 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:29.679356 2578 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d40cad4f-6109-4381-b7f5-2a037fc43ff8-config-out\") on node \"ip-10-0-134-136.ec2.internal\" DevicePath \"\""
Apr 21 04:01:29.679691 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:29.679366 2578 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/d40cad4f-6109-4381-b7f5-2a037fc43ff8-config-volume\") on node \"ip-10-0-134-136.ec2.internal\" DevicePath \"\""
Apr 21 04:01:29.679691 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:29.679374 2578 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/d40cad4f-6109-4381-b7f5-2a037fc43ff8-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-134-136.ec2.internal\" DevicePath \"\""
Apr 21 04:01:29.679691 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:29.679382 2578 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d40cad4f-6109-4381-b7f5-2a037fc43ff8-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-134-136.ec2.internal\" DevicePath \"\""
Apr 21 04:01:29.710068 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:29.710041 2578 generic.go:358] "Generic (PLEG): container finished" podID="d40cad4f-6109-4381-b7f5-2a037fc43ff8" containerID="6d8fa447a4b41fc877916f05f14e85b11d3767814f12e9c224e0936c7c4e0bc2" exitCode=0
Apr 21 04:01:29.710212 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:29.710093 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d40cad4f-6109-4381-b7f5-2a037fc43ff8","Type":"ContainerDied","Data":"6d8fa447a4b41fc877916f05f14e85b11d3767814f12e9c224e0936c7c4e0bc2"}
Apr 21 04:01:29.710212 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:29.710115 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d40cad4f-6109-4381-b7f5-2a037fc43ff8","Type":"ContainerDied","Data":"f34dce73f172e2371ae7e0c77bce858ffaa24f6c443b398f43dfc2f5a954246f"}
Apr 21 04:01:29.710212 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:29.710131 2578 scope.go:117] "RemoveContainer" containerID="21cecbf4925efba46baacc22b998ebd0018539e509c503aaf1692ed7c7ded8cc"
Apr 21 04:01:29.710212 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:29.710143 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 21 04:01:29.718135 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:29.718082 2578 scope.go:117] "RemoveContainer" containerID="41b84b76ee99b1fa4ac51c2ac9f66057d4eede5e527f17a61f5c261b8e2266d1"
Apr 21 04:01:29.725088 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:29.725070 2578 scope.go:117] "RemoveContainer" containerID="d9fa315a3c7d1735c87ea493f877bb6ebfd07ce084fa8d82e82c99800f888023"
Apr 21 04:01:29.731458 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:29.731435 2578 scope.go:117] "RemoveContainer" containerID="6d8fa447a4b41fc877916f05f14e85b11d3767814f12e9c224e0936c7c4e0bc2"
Apr 21 04:01:29.733329 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:29.733304 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 21 04:01:29.736787 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:29.736762 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 21 04:01:29.737956 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:29.737936 2578 scope.go:117] "RemoveContainer" containerID="2bf2c42ce6087b7e9bb0b6e8cd92fec4d2305208a5e03ed8c18a6290f7ea27bc"
Apr 21 04:01:29.744365 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:29.744348 2578 scope.go:117] "RemoveContainer" containerID="4537f4f441d81ca6d8969dfe7ce79981942a3f0378721423f0891bd28bdd4c03"
Apr 21 04:01:29.750594 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:29.750576 2578 scope.go:117] "RemoveContainer" containerID="5981232c31672bf11741f5962ea983b74763ceb5329bfc3128c7194c6df2a378"
Apr 21 04:01:29.756806 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:29.756790 2578 scope.go:117] "RemoveContainer" containerID="21cecbf4925efba46baacc22b998ebd0018539e509c503aaf1692ed7c7ded8cc"
Apr 21 04:01:29.757095 ip-10-0-134-136 kubenswrapper[2578]: E0421 04:01:29.757076 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"21cecbf4925efba46baacc22b998ebd0018539e509c503aaf1692ed7c7ded8cc\": container with ID starting with 21cecbf4925efba46baacc22b998ebd0018539e509c503aaf1692ed7c7ded8cc not found: ID does not exist" containerID="21cecbf4925efba46baacc22b998ebd0018539e509c503aaf1692ed7c7ded8cc"
Apr 21 04:01:29.757167 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:29.757107 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21cecbf4925efba46baacc22b998ebd0018539e509c503aaf1692ed7c7ded8cc"} err="failed to get container status \"21cecbf4925efba46baacc22b998ebd0018539e509c503aaf1692ed7c7ded8cc\": rpc error: code = NotFound desc = could not find container \"21cecbf4925efba46baacc22b998ebd0018539e509c503aaf1692ed7c7ded8cc\": container with ID starting with 21cecbf4925efba46baacc22b998ebd0018539e509c503aaf1692ed7c7ded8cc not found: ID does not exist"
Apr 21 04:01:29.757167 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:29.757149 2578 scope.go:117] "RemoveContainer" containerID="41b84b76ee99b1fa4ac51c2ac9f66057d4eede5e527f17a61f5c261b8e2266d1"
Apr 21 04:01:29.757503 ip-10-0-134-136 kubenswrapper[2578]: E0421 04:01:29.757485 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"41b84b76ee99b1fa4ac51c2ac9f66057d4eede5e527f17a61f5c261b8e2266d1\": container with ID starting with 41b84b76ee99b1fa4ac51c2ac9f66057d4eede5e527f17a61f5c261b8e2266d1 not found: ID does not exist" containerID="41b84b76ee99b1fa4ac51c2ac9f66057d4eede5e527f17a61f5c261b8e2266d1"
Apr 21 04:01:29.757558 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:29.757519 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41b84b76ee99b1fa4ac51c2ac9f66057d4eede5e527f17a61f5c261b8e2266d1"} err="failed to get container status \"41b84b76ee99b1fa4ac51c2ac9f66057d4eede5e527f17a61f5c261b8e2266d1\": rpc error: code = NotFound desc = could not find container \"41b84b76ee99b1fa4ac51c2ac9f66057d4eede5e527f17a61f5c261b8e2266d1\": container with ID starting with 41b84b76ee99b1fa4ac51c2ac9f66057d4eede5e527f17a61f5c261b8e2266d1 not found: ID does not exist"
Apr 21 04:01:29.757558 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:29.757534 2578 scope.go:117] "RemoveContainer" containerID="d9fa315a3c7d1735c87ea493f877bb6ebfd07ce084fa8d82e82c99800f888023"
Apr 21 04:01:29.757785 ip-10-0-134-136 kubenswrapper[2578]: E0421 04:01:29.757758 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d9fa315a3c7d1735c87ea493f877bb6ebfd07ce084fa8d82e82c99800f888023\": container with ID starting with d9fa315a3c7d1735c87ea493f877bb6ebfd07ce084fa8d82e82c99800f888023 not found: ID does not exist" containerID="d9fa315a3c7d1735c87ea493f877bb6ebfd07ce084fa8d82e82c99800f888023"
Apr 21 04:01:29.757826 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:29.757791 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9fa315a3c7d1735c87ea493f877bb6ebfd07ce084fa8d82e82c99800f888023"} err="failed to get container status \"d9fa315a3c7d1735c87ea493f877bb6ebfd07ce084fa8d82e82c99800f888023\": rpc error: code = NotFound desc = could not find container \"d9fa315a3c7d1735c87ea493f877bb6ebfd07ce084fa8d82e82c99800f888023\": container with ID starting with d9fa315a3c7d1735c87ea493f877bb6ebfd07ce084fa8d82e82c99800f888023 not found: ID does not exist"
Apr 21 04:01:29.757826 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:29.757806 2578 scope.go:117] "RemoveContainer" containerID="6d8fa447a4b41fc877916f05f14e85b11d3767814f12e9c224e0936c7c4e0bc2"
Apr 21 04:01:29.758010 ip-10-0-134-136 kubenswrapper[2578]: E0421 04:01:29.757994 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d8fa447a4b41fc877916f05f14e85b11d3767814f12e9c224e0936c7c4e0bc2\": container with ID starting with 6d8fa447a4b41fc877916f05f14e85b11d3767814f12e9c224e0936c7c4e0bc2 not found: ID does not exist" containerID="6d8fa447a4b41fc877916f05f14e85b11d3767814f12e9c224e0936c7c4e0bc2"
Apr 21 04:01:29.758069 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:29.758017 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d8fa447a4b41fc877916f05f14e85b11d3767814f12e9c224e0936c7c4e0bc2"} err="failed to get container status \"6d8fa447a4b41fc877916f05f14e85b11d3767814f12e9c224e0936c7c4e0bc2\": rpc error: code = NotFound desc = could not find container \"6d8fa447a4b41fc877916f05f14e85b11d3767814f12e9c224e0936c7c4e0bc2\": container with ID starting with 6d8fa447a4b41fc877916f05f14e85b11d3767814f12e9c224e0936c7c4e0bc2 not found: ID does not exist"
Apr 21 04:01:29.758069 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:29.758039 2578 scope.go:117] "RemoveContainer" containerID="2bf2c42ce6087b7e9bb0b6e8cd92fec4d2305208a5e03ed8c18a6290f7ea27bc"
Apr 21 04:01:29.758270 ip-10-0-134-136 kubenswrapper[2578]: E0421 04:01:29.758255 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2bf2c42ce6087b7e9bb0b6e8cd92fec4d2305208a5e03ed8c18a6290f7ea27bc\": container with ID starting with 2bf2c42ce6087b7e9bb0b6e8cd92fec4d2305208a5e03ed8c18a6290f7ea27bc not found: ID does not exist" containerID="2bf2c42ce6087b7e9bb0b6e8cd92fec4d2305208a5e03ed8c18a6290f7ea27bc"
Apr 21 04:01:29.758330 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:29.758273 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2bf2c42ce6087b7e9bb0b6e8cd92fec4d2305208a5e03ed8c18a6290f7ea27bc"} err="failed to get container status \"2bf2c42ce6087b7e9bb0b6e8cd92fec4d2305208a5e03ed8c18a6290f7ea27bc\": rpc error: code = NotFound desc = could not find container \"2bf2c42ce6087b7e9bb0b6e8cd92fec4d2305208a5e03ed8c18a6290f7ea27bc\": container with ID starting with 2bf2c42ce6087b7e9bb0b6e8cd92fec4d2305208a5e03ed8c18a6290f7ea27bc not found: ID does not exist"
Apr 21 04:01:29.758330 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:29.758304 2578 scope.go:117] "RemoveContainer" containerID="4537f4f441d81ca6d8969dfe7ce79981942a3f0378721423f0891bd28bdd4c03"
Apr 21 04:01:29.758522 ip-10-0-134-136 kubenswrapper[2578]: E0421 04:01:29.758505 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4537f4f441d81ca6d8969dfe7ce79981942a3f0378721423f0891bd28bdd4c03\": container with ID starting with 4537f4f441d81ca6d8969dfe7ce79981942a3f0378721423f0891bd28bdd4c03 not found: ID does not exist" containerID="4537f4f441d81ca6d8969dfe7ce79981942a3f0378721423f0891bd28bdd4c03"
Apr 21 04:01:29.758558 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:29.758526 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4537f4f441d81ca6d8969dfe7ce79981942a3f0378721423f0891bd28bdd4c03"} err="failed to get container status \"4537f4f441d81ca6d8969dfe7ce79981942a3f0378721423f0891bd28bdd4c03\": rpc error: code = NotFound desc = could not find container \"4537f4f441d81ca6d8969dfe7ce79981942a3f0378721423f0891bd28bdd4c03\": container with ID starting with 4537f4f441d81ca6d8969dfe7ce79981942a3f0378721423f0891bd28bdd4c03 not found: ID does not exist"
Apr 21 04:01:29.758558 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:29.758550 2578 scope.go:117] "RemoveContainer" containerID="5981232c31672bf11741f5962ea983b74763ceb5329bfc3128c7194c6df2a378"
Apr 21 04:01:29.758766 ip-10-0-134-136 kubenswrapper[2578]: E0421 04:01:29.758751 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5981232c31672bf11741f5962ea983b74763ceb5329bfc3128c7194c6df2a378\": container with ID starting with 5981232c31672bf11741f5962ea983b74763ceb5329bfc3128c7194c6df2a378 not found: ID does not exist" containerID="5981232c31672bf11741f5962ea983b74763ceb5329bfc3128c7194c6df2a378"
Apr 21 04:01:29.758818 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:29.758773 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5981232c31672bf11741f5962ea983b74763ceb5329bfc3128c7194c6df2a378"} err="failed to get container status \"5981232c31672bf11741f5962ea983b74763ceb5329bfc3128c7194c6df2a378\": rpc error: code = NotFound desc = could not find container \"5981232c31672bf11741f5962ea983b74763ceb5329bfc3128c7194c6df2a378\": container with ID starting with 5981232c31672bf11741f5962ea983b74763ceb5329bfc3128c7194c6df2a378 not found: ID does not exist"
Apr 21 04:01:29.763393 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:29.763372 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 21 04:01:29.763713 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:29.763694 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d40cad4f-6109-4381-b7f5-2a037fc43ff8" containerName="alertmanager"
Apr 21 04:01:29.763789 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:29.763714 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="d40cad4f-6109-4381-b7f5-2a037fc43ff8" containerName="alertmanager"
Apr 21 04:01:29.763789 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:29.763725 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d40cad4f-6109-4381-b7f5-2a037fc43ff8" containerName="init-config-reloader"
Apr 21 04:01:29.763789 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:29.763733 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="d40cad4f-6109-4381-b7f5-2a037fc43ff8" containerName="init-config-reloader"
Apr 21 04:01:29.763789 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:29.763749 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d40cad4f-6109-4381-b7f5-2a037fc43ff8" containerName="kube-rbac-proxy-web"
Apr 21 04:01:29.763789 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:29.763758 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="d40cad4f-6109-4381-b7f5-2a037fc43ff8" containerName="kube-rbac-proxy-web"
Apr 21 04:01:29.763789 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:29.763769 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d40cad4f-6109-4381-b7f5-2a037fc43ff8" containerName="prom-label-proxy"
Apr 21 04:01:29.763789 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:29.763778 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="d40cad4f-6109-4381-b7f5-2a037fc43ff8" containerName="prom-label-proxy"
Apr 21 04:01:29.764108 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:29.763797 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d40cad4f-6109-4381-b7f5-2a037fc43ff8" containerName="kube-rbac-proxy"
Apr 21 04:01:29.764108 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:29.763806 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="d40cad4f-6109-4381-b7f5-2a037fc43ff8" containerName="kube-rbac-proxy"
Apr 21 04:01:29.764108 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:29.763824 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d40cad4f-6109-4381-b7f5-2a037fc43ff8" containerName="kube-rbac-proxy-metric"
Apr 21 04:01:29.764108 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:29.763833 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="d40cad4f-6109-4381-b7f5-2a037fc43ff8" containerName="kube-rbac-proxy-metric"
Apr 21 04:01:29.764108 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:29.763843 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a2c56362-323c-41f4-a1ca-b564476fd1a1" containerName="console"
Apr 21 04:01:29.764108 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:29.763853 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2c56362-323c-41f4-a1ca-b564476fd1a1" containerName="console"
Apr 21 04:01:29.764108 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:29.763863 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d40cad4f-6109-4381-b7f5-2a037fc43ff8" containerName="config-reloader"
Apr 21 04:01:29.764108 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:29.763872 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="d40cad4f-6109-4381-b7f5-2a037fc43ff8" containerName="config-reloader"
Apr 21 04:01:29.764108 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:29.763969 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="d40cad4f-6109-4381-b7f5-2a037fc43ff8" containerName="kube-rbac-proxy-web"
Apr 21 04:01:29.764108 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:29.763984 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="d40cad4f-6109-4381-b7f5-2a037fc43ff8" containerName="kube-rbac-proxy"
Apr 21 04:01:29.764108 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:29.763996 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="d40cad4f-6109-4381-b7f5-2a037fc43ff8" containerName="kube-rbac-proxy-metric"
Apr 21 04:01:29.764108 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:29.764005 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="d40cad4f-6109-4381-b7f5-2a037fc43ff8" containerName="prom-label-proxy"
Apr 21 04:01:29.764108 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:29.764015 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="d40cad4f-6109-4381-b7f5-2a037fc43ff8" containerName="config-reloader"
Apr 21 04:01:29.764108 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:29.764024 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="a2c56362-323c-41f4-a1ca-b564476fd1a1" containerName="console"
Apr 21 04:01:29.764108 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:29.764035 2578 memory_manager.go:356] "RemoveStaleState removing state"
podUID="d40cad4f-6109-4381-b7f5-2a037fc43ff8" containerName="alertmanager" Apr 21 04:01:29.770267 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:29.770246 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:01:29.772853 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:29.772835 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 21 04:01:29.773114 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:29.773067 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 21 04:01:29.773114 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:29.773067 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 21 04:01:29.773254 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:29.773126 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 21 04:01:29.773254 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:29.773086 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 21 04:01:29.773254 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:29.773208 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 21 04:01:29.773528 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:29.773510 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-9cdtv\"" Apr 21 04:01:29.773661 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:29.773642 2578 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 21 04:01:29.773939 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:29.773924 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 21 04:01:29.778911 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:29.778888 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 21 04:01:29.784663 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:29.780557 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 21 04:01:29.881265 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:29.881221 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/03548feb-c89f-456f-97c0-dd5867c02ca1-tls-assets\") pod \"alertmanager-main-0\" (UID: \"03548feb-c89f-456f-97c0-dd5867c02ca1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:01:29.881265 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:29.881268 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/03548feb-c89f-456f-97c0-dd5867c02ca1-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"03548feb-c89f-456f-97c0-dd5867c02ca1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:01:29.881505 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:29.881316 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/03548feb-c89f-456f-97c0-dd5867c02ca1-config-out\") pod \"alertmanager-main-0\" (UID: \"03548feb-c89f-456f-97c0-dd5867c02ca1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 
21 04:01:29.881505 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:29.881392 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/03548feb-c89f-456f-97c0-dd5867c02ca1-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"03548feb-c89f-456f-97c0-dd5867c02ca1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:01:29.881505 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:29.881424 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/03548feb-c89f-456f-97c0-dd5867c02ca1-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"03548feb-c89f-456f-97c0-dd5867c02ca1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:01:29.881505 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:29.881449 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/03548feb-c89f-456f-97c0-dd5867c02ca1-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"03548feb-c89f-456f-97c0-dd5867c02ca1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:01:29.881505 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:29.881486 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/03548feb-c89f-456f-97c0-dd5867c02ca1-config-volume\") pod \"alertmanager-main-0\" (UID: \"03548feb-c89f-456f-97c0-dd5867c02ca1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:01:29.881650 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:29.881545 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/03548feb-c89f-456f-97c0-dd5867c02ca1-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"03548feb-c89f-456f-97c0-dd5867c02ca1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:01:29.881650 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:29.881579 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/03548feb-c89f-456f-97c0-dd5867c02ca1-web-config\") pod \"alertmanager-main-0\" (UID: \"03548feb-c89f-456f-97c0-dd5867c02ca1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:01:29.881650 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:29.881595 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/03548feb-c89f-456f-97c0-dd5867c02ca1-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"03548feb-c89f-456f-97c0-dd5867c02ca1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:01:29.881650 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:29.881620 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/03548feb-c89f-456f-97c0-dd5867c02ca1-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"03548feb-c89f-456f-97c0-dd5867c02ca1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:01:29.881650 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:29.881642 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x42dm\" (UniqueName: \"kubernetes.io/projected/03548feb-c89f-456f-97c0-dd5867c02ca1-kube-api-access-x42dm\") pod \"alertmanager-main-0\" (UID: \"03548feb-c89f-456f-97c0-dd5867c02ca1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 
04:01:29.881791 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:29.881689 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/03548feb-c89f-456f-97c0-dd5867c02ca1-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"03548feb-c89f-456f-97c0-dd5867c02ca1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:01:29.982836 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:29.982748 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/03548feb-c89f-456f-97c0-dd5867c02ca1-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"03548feb-c89f-456f-97c0-dd5867c02ca1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:01:29.982836 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:29.982789 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/03548feb-c89f-456f-97c0-dd5867c02ca1-web-config\") pod \"alertmanager-main-0\" (UID: \"03548feb-c89f-456f-97c0-dd5867c02ca1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:01:29.982836 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:29.982806 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/03548feb-c89f-456f-97c0-dd5867c02ca1-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"03548feb-c89f-456f-97c0-dd5867c02ca1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:01:29.983058 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:29.982983 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/03548feb-c89f-456f-97c0-dd5867c02ca1-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: 
\"03548feb-c89f-456f-97c0-dd5867c02ca1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:01:29.983058 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:29.983020 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x42dm\" (UniqueName: \"kubernetes.io/projected/03548feb-c89f-456f-97c0-dd5867c02ca1-kube-api-access-x42dm\") pod \"alertmanager-main-0\" (UID: \"03548feb-c89f-456f-97c0-dd5867c02ca1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:01:29.983159 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:29.983062 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/03548feb-c89f-456f-97c0-dd5867c02ca1-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"03548feb-c89f-456f-97c0-dd5867c02ca1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:01:29.983159 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:29.983099 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/03548feb-c89f-456f-97c0-dd5867c02ca1-tls-assets\") pod \"alertmanager-main-0\" (UID: \"03548feb-c89f-456f-97c0-dd5867c02ca1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:01:29.983159 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:29.983125 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/03548feb-c89f-456f-97c0-dd5867c02ca1-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"03548feb-c89f-456f-97c0-dd5867c02ca1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:01:29.983159 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:29.983153 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: 
\"kubernetes.io/empty-dir/03548feb-c89f-456f-97c0-dd5867c02ca1-config-out\") pod \"alertmanager-main-0\" (UID: \"03548feb-c89f-456f-97c0-dd5867c02ca1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:01:29.983395 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:29.983199 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/03548feb-c89f-456f-97c0-dd5867c02ca1-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"03548feb-c89f-456f-97c0-dd5867c02ca1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:01:29.983395 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:29.983232 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/03548feb-c89f-456f-97c0-dd5867c02ca1-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"03548feb-c89f-456f-97c0-dd5867c02ca1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:01:29.983395 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:29.983264 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/03548feb-c89f-456f-97c0-dd5867c02ca1-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"03548feb-c89f-456f-97c0-dd5867c02ca1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:01:29.983395 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:29.983307 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/03548feb-c89f-456f-97c0-dd5867c02ca1-config-volume\") pod \"alertmanager-main-0\" (UID: \"03548feb-c89f-456f-97c0-dd5867c02ca1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:01:29.983703 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:29.983679 2578 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/03548feb-c89f-456f-97c0-dd5867c02ca1-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"03548feb-c89f-456f-97c0-dd5867c02ca1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:01:29.983863 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:29.983836 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/03548feb-c89f-456f-97c0-dd5867c02ca1-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"03548feb-c89f-456f-97c0-dd5867c02ca1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:01:29.984328 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:29.984297 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/03548feb-c89f-456f-97c0-dd5867c02ca1-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"03548feb-c89f-456f-97c0-dd5867c02ca1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:01:29.986110 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:29.985883 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/03548feb-c89f-456f-97c0-dd5867c02ca1-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"03548feb-c89f-456f-97c0-dd5867c02ca1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:01:29.986110 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:29.985978 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/03548feb-c89f-456f-97c0-dd5867c02ca1-web-config\") pod \"alertmanager-main-0\" (UID: \"03548feb-c89f-456f-97c0-dd5867c02ca1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:01:29.986110 
ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:29.986049 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/03548feb-c89f-456f-97c0-dd5867c02ca1-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"03548feb-c89f-456f-97c0-dd5867c02ca1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:01:29.986110 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:29.986100 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/03548feb-c89f-456f-97c0-dd5867c02ca1-tls-assets\") pod \"alertmanager-main-0\" (UID: \"03548feb-c89f-456f-97c0-dd5867c02ca1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:01:29.986345 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:29.986196 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/03548feb-c89f-456f-97c0-dd5867c02ca1-config-volume\") pod \"alertmanager-main-0\" (UID: \"03548feb-c89f-456f-97c0-dd5867c02ca1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:01:29.986345 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:29.986266 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/03548feb-c89f-456f-97c0-dd5867c02ca1-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"03548feb-c89f-456f-97c0-dd5867c02ca1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:01:29.986589 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:29.986569 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/03548feb-c89f-456f-97c0-dd5867c02ca1-config-out\") pod \"alertmanager-main-0\" (UID: \"03548feb-c89f-456f-97c0-dd5867c02ca1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:01:29.986654 
ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:29.986638 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/03548feb-c89f-456f-97c0-dd5867c02ca1-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"03548feb-c89f-456f-97c0-dd5867c02ca1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:01:29.987870 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:29.987854 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/03548feb-c89f-456f-97c0-dd5867c02ca1-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"03548feb-c89f-456f-97c0-dd5867c02ca1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:01:29.990655 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:29.990639 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x42dm\" (UniqueName: \"kubernetes.io/projected/03548feb-c89f-456f-97c0-dd5867c02ca1-kube-api-access-x42dm\") pod \"alertmanager-main-0\" (UID: \"03548feb-c89f-456f-97c0-dd5867c02ca1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:01:30.081856 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:30.081805 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:01:30.204346 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:30.204321 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 21 04:01:30.206484 ip-10-0-134-136 kubenswrapper[2578]: W0421 04:01:30.206440 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod03548feb_c89f_456f_97c0_dd5867c02ca1.slice/crio-ec2fc56d3cfc6dc4b8e7efbe86b669e5c7f8691329e17b0e07e54b7a1f99d9d8 WatchSource:0}: Error finding container ec2fc56d3cfc6dc4b8e7efbe86b669e5c7f8691329e17b0e07e54b7a1f99d9d8: Status 404 returned error can't find the container with id ec2fc56d3cfc6dc4b8e7efbe86b669e5c7f8691329e17b0e07e54b7a1f99d9d8 Apr 21 04:01:30.714636 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:30.714593 2578 generic.go:358] "Generic (PLEG): container finished" podID="03548feb-c89f-456f-97c0-dd5867c02ca1" containerID="b965557921d266729f285b3001f592f43fa7d918aa79c2dc1e9d6276edfc4e5f" exitCode=0 Apr 21 04:01:30.715108 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:30.714675 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"03548feb-c89f-456f-97c0-dd5867c02ca1","Type":"ContainerDied","Data":"b965557921d266729f285b3001f592f43fa7d918aa79c2dc1e9d6276edfc4e5f"} Apr 21 04:01:30.715108 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:30.714714 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"03548feb-c89f-456f-97c0-dd5867c02ca1","Type":"ContainerStarted","Data":"ec2fc56d3cfc6dc4b8e7efbe86b669e5c7f8691329e17b0e07e54b7a1f99d9d8"} Apr 21 04:01:30.928910 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:30.928876 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d40cad4f-6109-4381-b7f5-2a037fc43ff8" 
path="/var/lib/kubelet/pods/d40cad4f-6109-4381-b7f5-2a037fc43ff8/volumes" Apr 21 04:01:31.720946 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:31.720908 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"03548feb-c89f-456f-97c0-dd5867c02ca1","Type":"ContainerStarted","Data":"34b99f78109b6dbf7d4045d11a194dd7b38a85147f3b6235ad4a90ac66cb2ae3"} Apr 21 04:01:31.720946 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:31.720947 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"03548feb-c89f-456f-97c0-dd5867c02ca1","Type":"ContainerStarted","Data":"c1824daaaeef0fc1ef1593d2865a433e0bba8eebec3e559dfbac28e410508650"} Apr 21 04:01:31.721440 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:31.720959 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"03548feb-c89f-456f-97c0-dd5867c02ca1","Type":"ContainerStarted","Data":"26ddc06767fb190b8e8b7f0875408ac5e7fd1907fbadea9f9e82d26bb9ef3451"} Apr 21 04:01:31.721440 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:31.720974 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"03548feb-c89f-456f-97c0-dd5867c02ca1","Type":"ContainerStarted","Data":"91323c5a3cda3adfa436e1be83d6caf163acf8f757dc1bcd5684bacacaa66b6e"} Apr 21 04:01:31.721440 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:31.720984 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"03548feb-c89f-456f-97c0-dd5867c02ca1","Type":"ContainerStarted","Data":"4df9f842a97cbc42efa715f4eb4d18578d61547916beee691e3a0fa69e9e3350"} Apr 21 04:01:31.721440 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:31.720995 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"03548feb-c89f-456f-97c0-dd5867c02ca1","Type":"ContainerStarted","Data":"98ca2e0c2857c34ddb50f7e1a67643734c243df9cdbf0db0e49d85d289325ec2"} Apr 21 04:01:31.747134 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:31.747078 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.747061752 podStartE2EDuration="2.747061752s" podCreationTimestamp="2026-04-21 04:01:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 04:01:31.745353037 +0000 UTC m=+235.414792136" watchObservedRunningTime="2026-04-21 04:01:31.747061752 +0000 UTC m=+235.416500827" Apr 21 04:01:32.295290 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:32.295255 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-97bc57f6c-5m8lx"] Apr 21 04:01:32.299040 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:32.298870 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-97bc57f6c-5m8lx" Apr 21 04:01:32.302374 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:32.302333 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\"" Apr 21 04:01:32.302791 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:32.302774 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\"" Apr 21 04:01:32.302791 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:32.302781 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\"" Apr 21 04:01:32.303292 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:32.303257 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\"" Apr 21 04:01:32.303539 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:32.303520 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\"" Apr 21 04:01:32.303743 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:32.303726 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-bb78p\"" Apr 21 04:01:32.315168 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:32.315146 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\"" Apr 21 04:01:32.320869 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:32.320845 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-97bc57f6c-5m8lx"] Apr 21 04:01:32.405235 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:32.405198 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/67d68952-11d7-4886-b63f-c39a5c3ef6d9-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-97bc57f6c-5m8lx\" (UID: \"67d68952-11d7-4886-b63f-c39a5c3ef6d9\") " pod="openshift-monitoring/telemeter-client-97bc57f6c-5m8lx"
Apr 21 04:01:32.405421 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:32.405243 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/67d68952-11d7-4886-b63f-c39a5c3ef6d9-secret-telemeter-client\") pod \"telemeter-client-97bc57f6c-5m8lx\" (UID: \"67d68952-11d7-4886-b63f-c39a5c3ef6d9\") " pod="openshift-monitoring/telemeter-client-97bc57f6c-5m8lx"
Apr 21 04:01:32.405421 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:32.405265 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/67d68952-11d7-4886-b63f-c39a5c3ef6d9-telemeter-trusted-ca-bundle\") pod \"telemeter-client-97bc57f6c-5m8lx\" (UID: \"67d68952-11d7-4886-b63f-c39a5c3ef6d9\") " pod="openshift-monitoring/telemeter-client-97bc57f6c-5m8lx"
Apr 21 04:01:32.405421 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:32.405380 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/67d68952-11d7-4886-b63f-c39a5c3ef6d9-federate-client-tls\") pod \"telemeter-client-97bc57f6c-5m8lx\" (UID: \"67d68952-11d7-4886-b63f-c39a5c3ef6d9\") " pod="openshift-monitoring/telemeter-client-97bc57f6c-5m8lx"
Apr 21 04:01:32.405576 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:32.405476 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2hmz\" (UniqueName: \"kubernetes.io/projected/67d68952-11d7-4886-b63f-c39a5c3ef6d9-kube-api-access-m2hmz\") pod \"telemeter-client-97bc57f6c-5m8lx\" (UID: \"67d68952-11d7-4886-b63f-c39a5c3ef6d9\") " pod="openshift-monitoring/telemeter-client-97bc57f6c-5m8lx"
Apr 21 04:01:32.405576 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:32.405530 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/67d68952-11d7-4886-b63f-c39a5c3ef6d9-serving-certs-ca-bundle\") pod \"telemeter-client-97bc57f6c-5m8lx\" (UID: \"67d68952-11d7-4886-b63f-c39a5c3ef6d9\") " pod="openshift-monitoring/telemeter-client-97bc57f6c-5m8lx"
Apr 21 04:01:32.405576 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:32.405554 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/67d68952-11d7-4886-b63f-c39a5c3ef6d9-telemeter-client-tls\") pod \"telemeter-client-97bc57f6c-5m8lx\" (UID: \"67d68952-11d7-4886-b63f-c39a5c3ef6d9\") " pod="openshift-monitoring/telemeter-client-97bc57f6c-5m8lx"
Apr 21 04:01:32.405576 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:32.405568 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/67d68952-11d7-4886-b63f-c39a5c3ef6d9-metrics-client-ca\") pod \"telemeter-client-97bc57f6c-5m8lx\" (UID: \"67d68952-11d7-4886-b63f-c39a5c3ef6d9\") " pod="openshift-monitoring/telemeter-client-97bc57f6c-5m8lx"
Apr 21 04:01:32.507195 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:32.506696 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/67d68952-11d7-4886-b63f-c39a5c3ef6d9-federate-client-tls\") pod \"telemeter-client-97bc57f6c-5m8lx\" (UID: \"67d68952-11d7-4886-b63f-c39a5c3ef6d9\") " pod="openshift-monitoring/telemeter-client-97bc57f6c-5m8lx"
Apr 21 04:01:32.507398 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:32.507266 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m2hmz\" (UniqueName: \"kubernetes.io/projected/67d68952-11d7-4886-b63f-c39a5c3ef6d9-kube-api-access-m2hmz\") pod \"telemeter-client-97bc57f6c-5m8lx\" (UID: \"67d68952-11d7-4886-b63f-c39a5c3ef6d9\") " pod="openshift-monitoring/telemeter-client-97bc57f6c-5m8lx"
Apr 21 04:01:32.507398 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:32.507347 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/67d68952-11d7-4886-b63f-c39a5c3ef6d9-serving-certs-ca-bundle\") pod \"telemeter-client-97bc57f6c-5m8lx\" (UID: \"67d68952-11d7-4886-b63f-c39a5c3ef6d9\") " pod="openshift-monitoring/telemeter-client-97bc57f6c-5m8lx"
Apr 21 04:01:32.507398 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:32.507381 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/67d68952-11d7-4886-b63f-c39a5c3ef6d9-telemeter-client-tls\") pod \"telemeter-client-97bc57f6c-5m8lx\" (UID: \"67d68952-11d7-4886-b63f-c39a5c3ef6d9\") " pod="openshift-monitoring/telemeter-client-97bc57f6c-5m8lx"
Apr 21 04:01:32.507568 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:32.507404 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/67d68952-11d7-4886-b63f-c39a5c3ef6d9-metrics-client-ca\") pod \"telemeter-client-97bc57f6c-5m8lx\" (UID: \"67d68952-11d7-4886-b63f-c39a5c3ef6d9\") " pod="openshift-monitoring/telemeter-client-97bc57f6c-5m8lx"
Apr 21 04:01:32.507568 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:32.507437 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/67d68952-11d7-4886-b63f-c39a5c3ef6d9-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-97bc57f6c-5m8lx\" (UID: \"67d68952-11d7-4886-b63f-c39a5c3ef6d9\") " pod="openshift-monitoring/telemeter-client-97bc57f6c-5m8lx"
Apr 21 04:01:32.507568 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:32.507481 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/67d68952-11d7-4886-b63f-c39a5c3ef6d9-secret-telemeter-client\") pod \"telemeter-client-97bc57f6c-5m8lx\" (UID: \"67d68952-11d7-4886-b63f-c39a5c3ef6d9\") " pod="openshift-monitoring/telemeter-client-97bc57f6c-5m8lx"
Apr 21 04:01:32.507568 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:32.507512 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/67d68952-11d7-4886-b63f-c39a5c3ef6d9-telemeter-trusted-ca-bundle\") pod \"telemeter-client-97bc57f6c-5m8lx\" (UID: \"67d68952-11d7-4886-b63f-c39a5c3ef6d9\") " pod="openshift-monitoring/telemeter-client-97bc57f6c-5m8lx"
Apr 21 04:01:32.508466 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:32.508437 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/67d68952-11d7-4886-b63f-c39a5c3ef6d9-telemeter-trusted-ca-bundle\") pod \"telemeter-client-97bc57f6c-5m8lx\" (UID: \"67d68952-11d7-4886-b63f-c39a5c3ef6d9\") " pod="openshift-monitoring/telemeter-client-97bc57f6c-5m8lx"
Apr 21 04:01:32.508834 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:32.508806 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/67d68952-11d7-4886-b63f-c39a5c3ef6d9-serving-certs-ca-bundle\") pod \"telemeter-client-97bc57f6c-5m8lx\" (UID: \"67d68952-11d7-4886-b63f-c39a5c3ef6d9\") " pod="openshift-monitoring/telemeter-client-97bc57f6c-5m8lx"
Apr 21 04:01:32.508931 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:32.508878 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/67d68952-11d7-4886-b63f-c39a5c3ef6d9-metrics-client-ca\") pod \"telemeter-client-97bc57f6c-5m8lx\" (UID: \"67d68952-11d7-4886-b63f-c39a5c3ef6d9\") " pod="openshift-monitoring/telemeter-client-97bc57f6c-5m8lx"
Apr 21 04:01:32.510708 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:32.510684 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/67d68952-11d7-4886-b63f-c39a5c3ef6d9-telemeter-client-tls\") pod \"telemeter-client-97bc57f6c-5m8lx\" (UID: \"67d68952-11d7-4886-b63f-c39a5c3ef6d9\") " pod="openshift-monitoring/telemeter-client-97bc57f6c-5m8lx"
Apr 21 04:01:32.510812 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:32.510722 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/67d68952-11d7-4886-b63f-c39a5c3ef6d9-federate-client-tls\") pod \"telemeter-client-97bc57f6c-5m8lx\" (UID: \"67d68952-11d7-4886-b63f-c39a5c3ef6d9\") " pod="openshift-monitoring/telemeter-client-97bc57f6c-5m8lx"
Apr 21 04:01:32.510997 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:32.510971 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/67d68952-11d7-4886-b63f-c39a5c3ef6d9-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-97bc57f6c-5m8lx\" (UID: \"67d68952-11d7-4886-b63f-c39a5c3ef6d9\") " pod="openshift-monitoring/telemeter-client-97bc57f6c-5m8lx"
Apr 21 04:01:32.511056 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:32.511025 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/67d68952-11d7-4886-b63f-c39a5c3ef6d9-secret-telemeter-client\") pod \"telemeter-client-97bc57f6c-5m8lx\" (UID: \"67d68952-11d7-4886-b63f-c39a5c3ef6d9\") " pod="openshift-monitoring/telemeter-client-97bc57f6c-5m8lx"
Apr 21 04:01:32.517053 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:32.517024 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2hmz\" (UniqueName: \"kubernetes.io/projected/67d68952-11d7-4886-b63f-c39a5c3ef6d9-kube-api-access-m2hmz\") pod \"telemeter-client-97bc57f6c-5m8lx\" (UID: \"67d68952-11d7-4886-b63f-c39a5c3ef6d9\") " pod="openshift-monitoring/telemeter-client-97bc57f6c-5m8lx"
Apr 21 04:01:32.611889 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:32.611805 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-97bc57f6c-5m8lx"
Apr 21 04:01:32.750632 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:32.750601 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-97bc57f6c-5m8lx"]
Apr 21 04:01:32.753240 ip-10-0-134-136 kubenswrapper[2578]: W0421 04:01:32.753207 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod67d68952_11d7_4886_b63f_c39a5c3ef6d9.slice/crio-a6385f4a14e2a8d1debde90238268ec45e6641529436f0cb5329051fd13c9e31 WatchSource:0}: Error finding container a6385f4a14e2a8d1debde90238268ec45e6641529436f0cb5329051fd13c9e31: Status 404 returned error can't find the container with id a6385f4a14e2a8d1debde90238268ec45e6641529436f0cb5329051fd13c9e31
Apr 21 04:01:33.728575 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:33.728526 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-97bc57f6c-5m8lx" event={"ID":"67d68952-11d7-4886-b63f-c39a5c3ef6d9","Type":"ContainerStarted","Data":"a6385f4a14e2a8d1debde90238268ec45e6641529436f0cb5329051fd13c9e31"}
Apr 21 04:01:34.733585 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:34.733544 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-97bc57f6c-5m8lx" event={"ID":"67d68952-11d7-4886-b63f-c39a5c3ef6d9","Type":"ContainerStarted","Data":"64231793c6a0c103f1b95585c74e598343f8b9745fb761d78992ac569a9a5326"}
Apr 21 04:01:34.733585 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:34.733585 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-97bc57f6c-5m8lx" event={"ID":"67d68952-11d7-4886-b63f-c39a5c3ef6d9","Type":"ContainerStarted","Data":"631d382a1a27b8a08ad3e021037552d28bb141e514b08d97cb6d10e87d52a150"}
Apr 21 04:01:34.734030 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:34.733596 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-97bc57f6c-5m8lx" event={"ID":"67d68952-11d7-4886-b63f-c39a5c3ef6d9","Type":"ContainerStarted","Data":"ecd562342c889f6e0c891e3c53ec20db20c1ec8ba06d6c592a885dd6c885a01f"}
Apr 21 04:01:34.754777 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:34.754718 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-97bc57f6c-5m8lx" podStartSLOduration=1.380720717 podStartE2EDuration="2.754697803s" podCreationTimestamp="2026-04-21 04:01:32 +0000 UTC" firstStartedPulling="2026-04-21 04:01:32.755295456 +0000 UTC m=+236.424734524" lastFinishedPulling="2026-04-21 04:01:34.129272557 +0000 UTC m=+237.798711610" observedRunningTime="2026-04-21 04:01:34.753399453 +0000 UTC m=+238.422838531" watchObservedRunningTime="2026-04-21 04:01:34.754697803 +0000 UTC m=+238.424136879"
Apr 21 04:01:35.630764 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:35.630728 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-54fd69787c-44b7j"]
Apr 21 04:01:35.634369 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:35.634347 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-54fd69787c-44b7j"
Apr 21 04:01:35.644519 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:35.644492 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-54fd69787c-44b7j"]
Apr 21 04:01:35.738518 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:35.738484 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/249120e8-680c-425e-8ee1-eccf87943943-console-config\") pod \"console-54fd69787c-44b7j\" (UID: \"249120e8-680c-425e-8ee1-eccf87943943\") " pod="openshift-console/console-54fd69787c-44b7j"
Apr 21 04:01:35.738912 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:35.738537 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/249120e8-680c-425e-8ee1-eccf87943943-console-oauth-config\") pod \"console-54fd69787c-44b7j\" (UID: \"249120e8-680c-425e-8ee1-eccf87943943\") " pod="openshift-console/console-54fd69787c-44b7j"
Apr 21 04:01:35.738912 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:35.738557 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xlxww\" (UniqueName: \"kubernetes.io/projected/249120e8-680c-425e-8ee1-eccf87943943-kube-api-access-xlxww\") pod \"console-54fd69787c-44b7j\" (UID: \"249120e8-680c-425e-8ee1-eccf87943943\") " pod="openshift-console/console-54fd69787c-44b7j"
Apr 21 04:01:35.738912 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:35.738616 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/249120e8-680c-425e-8ee1-eccf87943943-service-ca\") pod \"console-54fd69787c-44b7j\" (UID: \"249120e8-680c-425e-8ee1-eccf87943943\") " pod="openshift-console/console-54fd69787c-44b7j"
Apr 21 04:01:35.738912 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:35.738659 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/249120e8-680c-425e-8ee1-eccf87943943-console-serving-cert\") pod \"console-54fd69787c-44b7j\" (UID: \"249120e8-680c-425e-8ee1-eccf87943943\") " pod="openshift-console/console-54fd69787c-44b7j"
Apr 21 04:01:35.738912 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:35.738690 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/249120e8-680c-425e-8ee1-eccf87943943-trusted-ca-bundle\") pod \"console-54fd69787c-44b7j\" (UID: \"249120e8-680c-425e-8ee1-eccf87943943\") " pod="openshift-console/console-54fd69787c-44b7j"
Apr 21 04:01:35.738912 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:35.738713 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/249120e8-680c-425e-8ee1-eccf87943943-oauth-serving-cert\") pod \"console-54fd69787c-44b7j\" (UID: \"249120e8-680c-425e-8ee1-eccf87943943\") " pod="openshift-console/console-54fd69787c-44b7j"
Apr 21 04:01:35.839478 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:35.839437 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/249120e8-680c-425e-8ee1-eccf87943943-console-oauth-config\") pod \"console-54fd69787c-44b7j\" (UID: \"249120e8-680c-425e-8ee1-eccf87943943\") " pod="openshift-console/console-54fd69787c-44b7j"
Apr 21 04:01:35.839478 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:35.839479 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xlxww\" (UniqueName: \"kubernetes.io/projected/249120e8-680c-425e-8ee1-eccf87943943-kube-api-access-xlxww\") pod \"console-54fd69787c-44b7j\" (UID: \"249120e8-680c-425e-8ee1-eccf87943943\") " pod="openshift-console/console-54fd69787c-44b7j"
Apr 21 04:01:35.839741 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:35.839512 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/249120e8-680c-425e-8ee1-eccf87943943-service-ca\") pod \"console-54fd69787c-44b7j\" (UID: \"249120e8-680c-425e-8ee1-eccf87943943\") " pod="openshift-console/console-54fd69787c-44b7j"
Apr 21 04:01:35.839741 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:35.839559 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/249120e8-680c-425e-8ee1-eccf87943943-console-serving-cert\") pod \"console-54fd69787c-44b7j\" (UID: \"249120e8-680c-425e-8ee1-eccf87943943\") " pod="openshift-console/console-54fd69787c-44b7j"
Apr 21 04:01:35.839861 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:35.839840 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/249120e8-680c-425e-8ee1-eccf87943943-trusted-ca-bundle\") pod \"console-54fd69787c-44b7j\" (UID: \"249120e8-680c-425e-8ee1-eccf87943943\") " pod="openshift-console/console-54fd69787c-44b7j"
Apr 21 04:01:35.839933 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:35.839887 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/249120e8-680c-425e-8ee1-eccf87943943-oauth-serving-cert\") pod \"console-54fd69787c-44b7j\" (UID: \"249120e8-680c-425e-8ee1-eccf87943943\") " pod="openshift-console/console-54fd69787c-44b7j"
Apr 21 04:01:35.840019 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:35.839999 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/249120e8-680c-425e-8ee1-eccf87943943-console-config\") pod \"console-54fd69787c-44b7j\" (UID: \"249120e8-680c-425e-8ee1-eccf87943943\") " pod="openshift-console/console-54fd69787c-44b7j"
Apr 21 04:01:35.840417 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:35.840390 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/249120e8-680c-425e-8ee1-eccf87943943-service-ca\") pod \"console-54fd69787c-44b7j\" (UID: \"249120e8-680c-425e-8ee1-eccf87943943\") " pod="openshift-console/console-54fd69787c-44b7j"
Apr 21 04:01:35.840664 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:35.840644 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/249120e8-680c-425e-8ee1-eccf87943943-oauth-serving-cert\") pod \"console-54fd69787c-44b7j\" (UID: \"249120e8-680c-425e-8ee1-eccf87943943\") " pod="openshift-console/console-54fd69787c-44b7j"
Apr 21 04:01:35.840861 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:35.840840 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/249120e8-680c-425e-8ee1-eccf87943943-console-config\") pod \"console-54fd69787c-44b7j\" (UID: \"249120e8-680c-425e-8ee1-eccf87943943\") " pod="openshift-console/console-54fd69787c-44b7j"
Apr 21 04:01:35.841049 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:35.841032 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/249120e8-680c-425e-8ee1-eccf87943943-trusted-ca-bundle\") pod \"console-54fd69787c-44b7j\" (UID: \"249120e8-680c-425e-8ee1-eccf87943943\") " pod="openshift-console/console-54fd69787c-44b7j"
Apr 21 04:01:35.842181 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:35.842153 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/249120e8-680c-425e-8ee1-eccf87943943-console-serving-cert\") pod \"console-54fd69787c-44b7j\" (UID: \"249120e8-680c-425e-8ee1-eccf87943943\") " pod="openshift-console/console-54fd69787c-44b7j"
Apr 21 04:01:35.842181 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:35.842173 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/249120e8-680c-425e-8ee1-eccf87943943-console-oauth-config\") pod \"console-54fd69787c-44b7j\" (UID: \"249120e8-680c-425e-8ee1-eccf87943943\") " pod="openshift-console/console-54fd69787c-44b7j"
Apr 21 04:01:35.847780 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:35.847761 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xlxww\" (UniqueName: \"kubernetes.io/projected/249120e8-680c-425e-8ee1-eccf87943943-kube-api-access-xlxww\") pod \"console-54fd69787c-44b7j\" (UID: \"249120e8-680c-425e-8ee1-eccf87943943\") " pod="openshift-console/console-54fd69787c-44b7j"
Apr 21 04:01:35.944979 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:35.944875 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-54fd69787c-44b7j"
Apr 21 04:01:36.069322 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:36.069269 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-54fd69787c-44b7j"]
Apr 21 04:01:36.072452 ip-10-0-134-136 kubenswrapper[2578]: W0421 04:01:36.072421 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod249120e8_680c_425e_8ee1_eccf87943943.slice/crio-a940574be81010657900d1756100887e4d874bc8cd9ae402f3751e8598c4371e WatchSource:0}: Error finding container a940574be81010657900d1756100887e4d874bc8cd9ae402f3751e8598c4371e: Status 404 returned error can't find the container with id a940574be81010657900d1756100887e4d874bc8cd9ae402f3751e8598c4371e
Apr 21 04:01:36.742031 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:36.741990 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-54fd69787c-44b7j" event={"ID":"249120e8-680c-425e-8ee1-eccf87943943","Type":"ContainerStarted","Data":"645b56f1b0f005e8aeaeb35d8e40fb4040f22bb45766a3b7f0f5c0d26e60ecba"}
Apr 21 04:01:36.742031 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:36.742032 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-54fd69787c-44b7j" event={"ID":"249120e8-680c-425e-8ee1-eccf87943943","Type":"ContainerStarted","Data":"a940574be81010657900d1756100887e4d874bc8cd9ae402f3751e8598c4371e"}
Apr 21 04:01:36.759466 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:36.759419 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-54fd69787c-44b7j" podStartSLOduration=1.759403302 podStartE2EDuration="1.759403302s" podCreationTimestamp="2026-04-21 04:01:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 04:01:36.757757256 +0000 UTC m=+240.427196341" watchObservedRunningTime="2026-04-21 04:01:36.759403302 +0000 UTC m=+240.428842376"
Apr 21 04:01:45.945448 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:45.945397 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-54fd69787c-44b7j"
Apr 21 04:01:45.945448 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:45.945460 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-54fd69787c-44b7j"
Apr 21 04:01:45.950182 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:45.950157 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-54fd69787c-44b7j"
Apr 21 04:01:46.776916 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:46.776885 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-54fd69787c-44b7j"
Apr 21 04:01:46.819176 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:46.819141 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-cdfc4cb68-9k284"]
Apr 21 04:01:47.643124 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:47.643022 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6e98b17f-4794-44aa-8756-58a9bd9cb37a-metrics-certs\") pod \"network-metrics-daemon-2lrq9\" (UID: \"6e98b17f-4794-44aa-8756-58a9bd9cb37a\") " pod="openshift-multus/network-metrics-daemon-2lrq9"
Apr 21 04:01:47.645487 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:47.645461 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6e98b17f-4794-44aa-8756-58a9bd9cb37a-metrics-certs\") pod \"network-metrics-daemon-2lrq9\" (UID: \"6e98b17f-4794-44aa-8756-58a9bd9cb37a\") " pod="openshift-multus/network-metrics-daemon-2lrq9"
Apr 21 04:01:47.932643 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:47.932561 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-k29s5\""
Apr 21 04:01:47.940097 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:47.940069 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2lrq9"
Apr 21 04:01:48.064158 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:48.064122 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-2lrq9"]
Apr 21 04:01:48.067132 ip-10-0-134-136 kubenswrapper[2578]: W0421 04:01:48.067100 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6e98b17f_4794_44aa_8756_58a9bd9cb37a.slice/crio-9af5bc1f65a3b21644402436d820890c5de359a0592a17558b0208ce0bc1b84e WatchSource:0}: Error finding container 9af5bc1f65a3b21644402436d820890c5de359a0592a17558b0208ce0bc1b84e: Status 404 returned error can't find the container with id 9af5bc1f65a3b21644402436d820890c5de359a0592a17558b0208ce0bc1b84e
Apr 21 04:01:48.781077 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:48.781039 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-2lrq9" event={"ID":"6e98b17f-4794-44aa-8756-58a9bd9cb37a","Type":"ContainerStarted","Data":"9af5bc1f65a3b21644402436d820890c5de359a0592a17558b0208ce0bc1b84e"}
Apr 21 04:01:49.785245 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:49.785207 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-2lrq9" event={"ID":"6e98b17f-4794-44aa-8756-58a9bd9cb37a","Type":"ContainerStarted","Data":"2d3cad1480c5058c922e3cf5fc8cb785d27a488da118f57b2b8d1c1b7d12dda3"}
Apr 21 04:01:49.785245 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:49.785246 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-2lrq9" event={"ID":"6e98b17f-4794-44aa-8756-58a9bd9cb37a","Type":"ContainerStarted","Data":"22a8b7348fb8d2b78416cd1a568407ce1f6f1813063ce2264c925398fdc9b5cc"}
Apr 21 04:01:49.800325 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:01:49.800249 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-2lrq9" podStartSLOduration=252.873523049 podStartE2EDuration="4m13.800232095s" podCreationTimestamp="2026-04-21 03:57:36 +0000 UTC" firstStartedPulling="2026-04-21 04:01:48.069062961 +0000 UTC m=+251.738502014" lastFinishedPulling="2026-04-21 04:01:48.995772001 +0000 UTC m=+252.665211060" observedRunningTime="2026-04-21 04:01:49.800139674 +0000 UTC m=+253.469578750" watchObservedRunningTime="2026-04-21 04:01:49.800232095 +0000 UTC m=+253.469671166"
Apr 21 04:02:11.842634 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:02:11.842593 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-cdfc4cb68-9k284" podUID="0542ee55-ebd2-4c1f-888f-a044bc8cc7dc" containerName="console" containerID="cri-o://cf47a063c8fa3bb3aec3c346b79b1e2f4185d05ed41135c23b1ede8b00e76f2f" gracePeriod=15
Apr 21 04:02:12.080594 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:02:12.080571 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-cdfc4cb68-9k284_0542ee55-ebd2-4c1f-888f-a044bc8cc7dc/console/0.log"
Apr 21 04:02:12.080739 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:02:12.080648 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-cdfc4cb68-9k284"
Apr 21 04:02:12.257569 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:02:12.257474 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0542ee55-ebd2-4c1f-888f-a044bc8cc7dc-console-oauth-config\") pod \"0542ee55-ebd2-4c1f-888f-a044bc8cc7dc\" (UID: \"0542ee55-ebd2-4c1f-888f-a044bc8cc7dc\") "
Apr 21 04:02:12.257569 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:02:12.257526 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4f9sp\" (UniqueName: \"kubernetes.io/projected/0542ee55-ebd2-4c1f-888f-a044bc8cc7dc-kube-api-access-4f9sp\") pod \"0542ee55-ebd2-4c1f-888f-a044bc8cc7dc\" (UID: \"0542ee55-ebd2-4c1f-888f-a044bc8cc7dc\") "
Apr 21 04:02:12.257569 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:02:12.257553 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0542ee55-ebd2-4c1f-888f-a044bc8cc7dc-trusted-ca-bundle\") pod \"0542ee55-ebd2-4c1f-888f-a044bc8cc7dc\" (UID: \"0542ee55-ebd2-4c1f-888f-a044bc8cc7dc\") "
Apr 21 04:02:12.257854 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:02:12.257595 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0542ee55-ebd2-4c1f-888f-a044bc8cc7dc-oauth-serving-cert\") pod \"0542ee55-ebd2-4c1f-888f-a044bc8cc7dc\" (UID: \"0542ee55-ebd2-4c1f-888f-a044bc8cc7dc\") "
Apr 21 04:02:12.257854 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:02:12.257620 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0542ee55-ebd2-4c1f-888f-a044bc8cc7dc-console-serving-cert\") pod \"0542ee55-ebd2-4c1f-888f-a044bc8cc7dc\" (UID: \"0542ee55-ebd2-4c1f-888f-a044bc8cc7dc\") "
Apr 21 04:02:12.257854 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:02:12.257655 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0542ee55-ebd2-4c1f-888f-a044bc8cc7dc-console-config\") pod \"0542ee55-ebd2-4c1f-888f-a044bc8cc7dc\" (UID: \"0542ee55-ebd2-4c1f-888f-a044bc8cc7dc\") "
Apr 21 04:02:12.257854 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:02:12.257703 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0542ee55-ebd2-4c1f-888f-a044bc8cc7dc-service-ca\") pod \"0542ee55-ebd2-4c1f-888f-a044bc8cc7dc\" (UID: \"0542ee55-ebd2-4c1f-888f-a044bc8cc7dc\") "
Apr 21 04:02:12.258183 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:02:12.258153 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0542ee55-ebd2-4c1f-888f-a044bc8cc7dc-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "0542ee55-ebd2-4c1f-888f-a044bc8cc7dc" (UID: "0542ee55-ebd2-4c1f-888f-a044bc8cc7dc"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 21 04:02:12.258183 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:02:12.258084 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0542ee55-ebd2-4c1f-888f-a044bc8cc7dc-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "0542ee55-ebd2-4c1f-888f-a044bc8cc7dc" (UID: "0542ee55-ebd2-4c1f-888f-a044bc8cc7dc"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 21 04:02:12.258371 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:02:12.258194 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0542ee55-ebd2-4c1f-888f-a044bc8cc7dc-console-config" (OuterVolumeSpecName: "console-config") pod "0542ee55-ebd2-4c1f-888f-a044bc8cc7dc" (UID: "0542ee55-ebd2-4c1f-888f-a044bc8cc7dc"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 21 04:02:12.258371 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:02:12.258267 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0542ee55-ebd2-4c1f-888f-a044bc8cc7dc-service-ca" (OuterVolumeSpecName: "service-ca") pod "0542ee55-ebd2-4c1f-888f-a044bc8cc7dc" (UID: "0542ee55-ebd2-4c1f-888f-a044bc8cc7dc"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 21 04:02:12.259771 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:02:12.259746 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0542ee55-ebd2-4c1f-888f-a044bc8cc7dc-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "0542ee55-ebd2-4c1f-888f-a044bc8cc7dc" (UID: "0542ee55-ebd2-4c1f-888f-a044bc8cc7dc"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 21 04:02:12.259852 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:02:12.259793 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0542ee55-ebd2-4c1f-888f-a044bc8cc7dc-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "0542ee55-ebd2-4c1f-888f-a044bc8cc7dc" (UID: "0542ee55-ebd2-4c1f-888f-a044bc8cc7dc"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 21 04:02:12.259852 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:02:12.259844 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0542ee55-ebd2-4c1f-888f-a044bc8cc7dc-kube-api-access-4f9sp" (OuterVolumeSpecName: "kube-api-access-4f9sp") pod "0542ee55-ebd2-4c1f-888f-a044bc8cc7dc" (UID: "0542ee55-ebd2-4c1f-888f-a044bc8cc7dc"). InnerVolumeSpecName "kube-api-access-4f9sp". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 04:02:12.359256 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:02:12.359216 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4f9sp\" (UniqueName: \"kubernetes.io/projected/0542ee55-ebd2-4c1f-888f-a044bc8cc7dc-kube-api-access-4f9sp\") on node \"ip-10-0-134-136.ec2.internal\" DevicePath \"\""
Apr 21 04:02:12.359256 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:02:12.359249 2578 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0542ee55-ebd2-4c1f-888f-a044bc8cc7dc-trusted-ca-bundle\") on node \"ip-10-0-134-136.ec2.internal\" DevicePath \"\""
Apr 21 04:02:12.359256 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:02:12.359259 2578 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0542ee55-ebd2-4c1f-888f-a044bc8cc7dc-oauth-serving-cert\") on node \"ip-10-0-134-136.ec2.internal\" DevicePath \"\""
Apr 21 04:02:12.359256 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:02:12.359269 2578 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0542ee55-ebd2-4c1f-888f-a044bc8cc7dc-console-serving-cert\") on node \"ip-10-0-134-136.ec2.internal\" DevicePath \"\""
Apr 21 04:02:12.359633 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:02:12.359335 2578 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0542ee55-ebd2-4c1f-888f-a044bc8cc7dc-console-config\") on node \"ip-10-0-134-136.ec2.internal\" DevicePath \"\""
Apr 21 04:02:12.359633 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:02:12.359351 2578 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0542ee55-ebd2-4c1f-888f-a044bc8cc7dc-service-ca\") on node \"ip-10-0-134-136.ec2.internal\" DevicePath \"\""
Apr 21 04:02:12.359633 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:02:12.359360 2578 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0542ee55-ebd2-4c1f-888f-a044bc8cc7dc-console-oauth-config\") on node \"ip-10-0-134-136.ec2.internal\" DevicePath \"\""
Apr 21 04:02:12.860710 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:02:12.860684 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-cdfc4cb68-9k284_0542ee55-ebd2-4c1f-888f-a044bc8cc7dc/console/0.log"
Apr 21 04:02:12.861120 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:02:12.860729 2578 generic.go:358] "Generic (PLEG): container finished" podID="0542ee55-ebd2-4c1f-888f-a044bc8cc7dc" containerID="cf47a063c8fa3bb3aec3c346b79b1e2f4185d05ed41135c23b1ede8b00e76f2f" exitCode=2
Apr 21 04:02:12.861120 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:02:12.860775 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-cdfc4cb68-9k284" event={"ID":"0542ee55-ebd2-4c1f-888f-a044bc8cc7dc","Type":"ContainerDied","Data":"cf47a063c8fa3bb3aec3c346b79b1e2f4185d05ed41135c23b1ede8b00e76f2f"}
Apr 21 04:02:12.861120 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:02:12.860804 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-cdfc4cb68-9k284" event={"ID":"0542ee55-ebd2-4c1f-888f-a044bc8cc7dc","Type":"ContainerDied","Data":"1765514563bfd08b79df0dc5c745eaecd1f1c84cacb138c68c8ce828e332c2bf"}
Apr 21 04:02:12.861120 ip-10-0-134-136
kubenswrapper[2578]: I0421 04:02:12.860808 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-cdfc4cb68-9k284" Apr 21 04:02:12.861120 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:02:12.860823 2578 scope.go:117] "RemoveContainer" containerID="cf47a063c8fa3bb3aec3c346b79b1e2f4185d05ed41135c23b1ede8b00e76f2f" Apr 21 04:02:12.871941 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:02:12.871918 2578 scope.go:117] "RemoveContainer" containerID="cf47a063c8fa3bb3aec3c346b79b1e2f4185d05ed41135c23b1ede8b00e76f2f" Apr 21 04:02:12.872496 ip-10-0-134-136 kubenswrapper[2578]: E0421 04:02:12.872471 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf47a063c8fa3bb3aec3c346b79b1e2f4185d05ed41135c23b1ede8b00e76f2f\": container with ID starting with cf47a063c8fa3bb3aec3c346b79b1e2f4185d05ed41135c23b1ede8b00e76f2f not found: ID does not exist" containerID="cf47a063c8fa3bb3aec3c346b79b1e2f4185d05ed41135c23b1ede8b00e76f2f" Apr 21 04:02:12.872606 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:02:12.872524 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf47a063c8fa3bb3aec3c346b79b1e2f4185d05ed41135c23b1ede8b00e76f2f"} err="failed to get container status \"cf47a063c8fa3bb3aec3c346b79b1e2f4185d05ed41135c23b1ede8b00e76f2f\": rpc error: code = NotFound desc = could not find container \"cf47a063c8fa3bb3aec3c346b79b1e2f4185d05ed41135c23b1ede8b00e76f2f\": container with ID starting with cf47a063c8fa3bb3aec3c346b79b1e2f4185d05ed41135c23b1ede8b00e76f2f not found: ID does not exist" Apr 21 04:02:12.881806 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:02:12.881776 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-cdfc4cb68-9k284"] Apr 21 04:02:12.885687 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:02:12.885660 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" 
pods=["openshift-console/console-cdfc4cb68-9k284"] Apr 21 04:02:12.928758 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:02:12.928720 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0542ee55-ebd2-4c1f-888f-a044bc8cc7dc" path="/var/lib/kubelet/pods/0542ee55-ebd2-4c1f-888f-a044bc8cc7dc/volumes" Apr 21 04:02:36.823557 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:02:36.823526 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-82wml_37b63820-2bf0-4a5b-82c5-6b56ab7689b7/ovn-acl-logging/0.log" Apr 21 04:02:36.823993 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:02:36.823701 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-82wml_37b63820-2bf0-4a5b-82c5-6b56ab7689b7/ovn-acl-logging/0.log" Apr 21 04:02:36.829699 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:02:36.829660 2578 kubelet.go:1628] "Image garbage collection succeeded" Apr 21 04:02:52.124304 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:02:52.124252 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-5d49d4b6dc-66ft2"] Apr 21 04:02:52.132007 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:02:52.124588 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0542ee55-ebd2-4c1f-888f-a044bc8cc7dc" containerName="console" Apr 21 04:02:52.132007 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:02:52.124601 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="0542ee55-ebd2-4c1f-888f-a044bc8cc7dc" containerName="console" Apr 21 04:02:52.132007 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:02:52.124662 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="0542ee55-ebd2-4c1f-888f-a044bc8cc7dc" containerName="console" Apr 21 04:02:52.132733 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:02:52.132714 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5d49d4b6dc-66ft2" Apr 21 04:02:52.135595 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:02:52.135564 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5d49d4b6dc-66ft2"] Apr 21 04:02:52.200005 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:02:52.199965 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/801f9aa9-c09f-4657-a23f-e88496dbbdd1-oauth-serving-cert\") pod \"console-5d49d4b6dc-66ft2\" (UID: \"801f9aa9-c09f-4657-a23f-e88496dbbdd1\") " pod="openshift-console/console-5d49d4b6dc-66ft2" Apr 21 04:02:52.200005 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:02:52.200026 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/801f9aa9-c09f-4657-a23f-e88496dbbdd1-console-config\") pod \"console-5d49d4b6dc-66ft2\" (UID: \"801f9aa9-c09f-4657-a23f-e88496dbbdd1\") " pod="openshift-console/console-5d49d4b6dc-66ft2" Apr 21 04:02:52.200296 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:02:52.200056 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/801f9aa9-c09f-4657-a23f-e88496dbbdd1-trusted-ca-bundle\") pod \"console-5d49d4b6dc-66ft2\" (UID: \"801f9aa9-c09f-4657-a23f-e88496dbbdd1\") " pod="openshift-console/console-5d49d4b6dc-66ft2" Apr 21 04:02:52.200296 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:02:52.200172 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/801f9aa9-c09f-4657-a23f-e88496dbbdd1-console-oauth-config\") pod \"console-5d49d4b6dc-66ft2\" (UID: \"801f9aa9-c09f-4657-a23f-e88496dbbdd1\") " pod="openshift-console/console-5d49d4b6dc-66ft2" Apr 21 
04:02:52.200296 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:02:52.200211 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqplv\" (UniqueName: \"kubernetes.io/projected/801f9aa9-c09f-4657-a23f-e88496dbbdd1-kube-api-access-gqplv\") pod \"console-5d49d4b6dc-66ft2\" (UID: \"801f9aa9-c09f-4657-a23f-e88496dbbdd1\") " pod="openshift-console/console-5d49d4b6dc-66ft2" Apr 21 04:02:52.200296 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:02:52.200253 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/801f9aa9-c09f-4657-a23f-e88496dbbdd1-console-serving-cert\") pod \"console-5d49d4b6dc-66ft2\" (UID: \"801f9aa9-c09f-4657-a23f-e88496dbbdd1\") " pod="openshift-console/console-5d49d4b6dc-66ft2" Apr 21 04:02:52.200580 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:02:52.200316 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/801f9aa9-c09f-4657-a23f-e88496dbbdd1-service-ca\") pod \"console-5d49d4b6dc-66ft2\" (UID: \"801f9aa9-c09f-4657-a23f-e88496dbbdd1\") " pod="openshift-console/console-5d49d4b6dc-66ft2" Apr 21 04:02:52.301158 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:02:52.301112 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/801f9aa9-c09f-4657-a23f-e88496dbbdd1-console-config\") pod \"console-5d49d4b6dc-66ft2\" (UID: \"801f9aa9-c09f-4657-a23f-e88496dbbdd1\") " pod="openshift-console/console-5d49d4b6dc-66ft2" Apr 21 04:02:52.301379 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:02:52.301171 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/801f9aa9-c09f-4657-a23f-e88496dbbdd1-trusted-ca-bundle\") pod 
\"console-5d49d4b6dc-66ft2\" (UID: \"801f9aa9-c09f-4657-a23f-e88496dbbdd1\") " pod="openshift-console/console-5d49d4b6dc-66ft2" Apr 21 04:02:52.301379 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:02:52.301210 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/801f9aa9-c09f-4657-a23f-e88496dbbdd1-console-oauth-config\") pod \"console-5d49d4b6dc-66ft2\" (UID: \"801f9aa9-c09f-4657-a23f-e88496dbbdd1\") " pod="openshift-console/console-5d49d4b6dc-66ft2" Apr 21 04:02:52.301379 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:02:52.301232 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gqplv\" (UniqueName: \"kubernetes.io/projected/801f9aa9-c09f-4657-a23f-e88496dbbdd1-kube-api-access-gqplv\") pod \"console-5d49d4b6dc-66ft2\" (UID: \"801f9aa9-c09f-4657-a23f-e88496dbbdd1\") " pod="openshift-console/console-5d49d4b6dc-66ft2" Apr 21 04:02:52.301379 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:02:52.301257 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/801f9aa9-c09f-4657-a23f-e88496dbbdd1-console-serving-cert\") pod \"console-5d49d4b6dc-66ft2\" (UID: \"801f9aa9-c09f-4657-a23f-e88496dbbdd1\") " pod="openshift-console/console-5d49d4b6dc-66ft2" Apr 21 04:02:52.301379 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:02:52.301329 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/801f9aa9-c09f-4657-a23f-e88496dbbdd1-service-ca\") pod \"console-5d49d4b6dc-66ft2\" (UID: \"801f9aa9-c09f-4657-a23f-e88496dbbdd1\") " pod="openshift-console/console-5d49d4b6dc-66ft2" Apr 21 04:02:52.301703 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:02:52.301402 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/801f9aa9-c09f-4657-a23f-e88496dbbdd1-oauth-serving-cert\") pod \"console-5d49d4b6dc-66ft2\" (UID: \"801f9aa9-c09f-4657-a23f-e88496dbbdd1\") " pod="openshift-console/console-5d49d4b6dc-66ft2" Apr 21 04:02:52.301990 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:02:52.301956 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/801f9aa9-c09f-4657-a23f-e88496dbbdd1-console-config\") pod \"console-5d49d4b6dc-66ft2\" (UID: \"801f9aa9-c09f-4657-a23f-e88496dbbdd1\") " pod="openshift-console/console-5d49d4b6dc-66ft2" Apr 21 04:02:52.302073 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:02:52.301999 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/801f9aa9-c09f-4657-a23f-e88496dbbdd1-oauth-serving-cert\") pod \"console-5d49d4b6dc-66ft2\" (UID: \"801f9aa9-c09f-4657-a23f-e88496dbbdd1\") " pod="openshift-console/console-5d49d4b6dc-66ft2" Apr 21 04:02:52.302073 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:02:52.302030 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/801f9aa9-c09f-4657-a23f-e88496dbbdd1-service-ca\") pod \"console-5d49d4b6dc-66ft2\" (UID: \"801f9aa9-c09f-4657-a23f-e88496dbbdd1\") " pod="openshift-console/console-5d49d4b6dc-66ft2" Apr 21 04:02:52.302173 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:02:52.302130 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/801f9aa9-c09f-4657-a23f-e88496dbbdd1-trusted-ca-bundle\") pod \"console-5d49d4b6dc-66ft2\" (UID: \"801f9aa9-c09f-4657-a23f-e88496dbbdd1\") " pod="openshift-console/console-5d49d4b6dc-66ft2" Apr 21 04:02:52.303828 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:02:52.303805 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/801f9aa9-c09f-4657-a23f-e88496dbbdd1-console-oauth-config\") pod \"console-5d49d4b6dc-66ft2\" (UID: \"801f9aa9-c09f-4657-a23f-e88496dbbdd1\") " pod="openshift-console/console-5d49d4b6dc-66ft2" Apr 21 04:02:52.303928 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:02:52.303913 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/801f9aa9-c09f-4657-a23f-e88496dbbdd1-console-serving-cert\") pod \"console-5d49d4b6dc-66ft2\" (UID: \"801f9aa9-c09f-4657-a23f-e88496dbbdd1\") " pod="openshift-console/console-5d49d4b6dc-66ft2" Apr 21 04:02:52.309724 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:02:52.309701 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqplv\" (UniqueName: \"kubernetes.io/projected/801f9aa9-c09f-4657-a23f-e88496dbbdd1-kube-api-access-gqplv\") pod \"console-5d49d4b6dc-66ft2\" (UID: \"801f9aa9-c09f-4657-a23f-e88496dbbdd1\") " pod="openshift-console/console-5d49d4b6dc-66ft2" Apr 21 04:02:52.443391 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:02:52.443257 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5d49d4b6dc-66ft2" Apr 21 04:02:52.565986 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:02:52.565908 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5d49d4b6dc-66ft2"] Apr 21 04:02:52.568446 ip-10-0-134-136 kubenswrapper[2578]: W0421 04:02:52.568411 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod801f9aa9_c09f_4657_a23f_e88496dbbdd1.slice/crio-2da1e4503eb09021ad4ee943c33b744a4f70ebae46233d474344e8c8ffc0254e WatchSource:0}: Error finding container 2da1e4503eb09021ad4ee943c33b744a4f70ebae46233d474344e8c8ffc0254e: Status 404 returned error can't find the container with id 2da1e4503eb09021ad4ee943c33b744a4f70ebae46233d474344e8c8ffc0254e Apr 21 04:02:52.570149 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:02:52.570133 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 21 04:02:52.977101 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:02:52.977067 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5d49d4b6dc-66ft2" event={"ID":"801f9aa9-c09f-4657-a23f-e88496dbbdd1","Type":"ContainerStarted","Data":"8c271bb56c22ace11851a27965393ec0547f9d509c32b0eef3483db622db7c12"} Apr 21 04:02:52.977101 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:02:52.977108 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5d49d4b6dc-66ft2" event={"ID":"801f9aa9-c09f-4657-a23f-e88496dbbdd1","Type":"ContainerStarted","Data":"2da1e4503eb09021ad4ee943c33b744a4f70ebae46233d474344e8c8ffc0254e"} Apr 21 04:02:52.996172 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:02:52.996109 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5d49d4b6dc-66ft2" podStartSLOduration=0.996088818 podStartE2EDuration="996.088818ms" podCreationTimestamp="2026-04-21 04:02:52 +0000 
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 04:02:52.993322144 +0000 UTC m=+316.662761228" watchObservedRunningTime="2026-04-21 04:02:52.996088818 +0000 UTC m=+316.665527894" Apr 21 04:03:02.444370 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:03:02.444321 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5d49d4b6dc-66ft2" Apr 21 04:03:02.444370 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:03:02.444378 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-5d49d4b6dc-66ft2" Apr 21 04:03:02.450122 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:03:02.450090 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5d49d4b6dc-66ft2" Apr 21 04:03:03.008540 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:03:03.008511 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5d49d4b6dc-66ft2" Apr 21 04:03:03.056919 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:03:03.056886 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-54fd69787c-44b7j"] Apr 21 04:03:28.077267 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:03:28.077160 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-54fd69787c-44b7j" podUID="249120e8-680c-425e-8ee1-eccf87943943" containerName="console" containerID="cri-o://645b56f1b0f005e8aeaeb35d8e40fb4040f22bb45766a3b7f0f5c0d26e60ecba" gracePeriod=15 Apr 21 04:03:28.322130 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:03:28.322103 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-54fd69787c-44b7j_249120e8-680c-425e-8ee1-eccf87943943/console/0.log" Apr 21 04:03:28.322270 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:03:28.322164 2578 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-54fd69787c-44b7j" Apr 21 04:03:28.407515 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:03:28.407418 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/249120e8-680c-425e-8ee1-eccf87943943-service-ca\") pod \"249120e8-680c-425e-8ee1-eccf87943943\" (UID: \"249120e8-680c-425e-8ee1-eccf87943943\") " Apr 21 04:03:28.407515 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:03:28.407463 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/249120e8-680c-425e-8ee1-eccf87943943-console-serving-cert\") pod \"249120e8-680c-425e-8ee1-eccf87943943\" (UID: \"249120e8-680c-425e-8ee1-eccf87943943\") " Apr 21 04:03:28.407807 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:03:28.407545 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/249120e8-680c-425e-8ee1-eccf87943943-console-config\") pod \"249120e8-680c-425e-8ee1-eccf87943943\" (UID: \"249120e8-680c-425e-8ee1-eccf87943943\") " Apr 21 04:03:28.407807 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:03:28.407568 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/249120e8-680c-425e-8ee1-eccf87943943-console-oauth-config\") pod \"249120e8-680c-425e-8ee1-eccf87943943\" (UID: \"249120e8-680c-425e-8ee1-eccf87943943\") " Apr 21 04:03:28.407807 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:03:28.407596 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/249120e8-680c-425e-8ee1-eccf87943943-oauth-serving-cert\") pod \"249120e8-680c-425e-8ee1-eccf87943943\" (UID: 
\"249120e8-680c-425e-8ee1-eccf87943943\") " Apr 21 04:03:28.407807 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:03:28.407622 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xlxww\" (UniqueName: \"kubernetes.io/projected/249120e8-680c-425e-8ee1-eccf87943943-kube-api-access-xlxww\") pod \"249120e8-680c-425e-8ee1-eccf87943943\" (UID: \"249120e8-680c-425e-8ee1-eccf87943943\") " Apr 21 04:03:28.407807 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:03:28.407685 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/249120e8-680c-425e-8ee1-eccf87943943-trusted-ca-bundle\") pod \"249120e8-680c-425e-8ee1-eccf87943943\" (UID: \"249120e8-680c-425e-8ee1-eccf87943943\") " Apr 21 04:03:28.408051 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:03:28.407843 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/249120e8-680c-425e-8ee1-eccf87943943-service-ca" (OuterVolumeSpecName: "service-ca") pod "249120e8-680c-425e-8ee1-eccf87943943" (UID: "249120e8-680c-425e-8ee1-eccf87943943"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 04:03:28.408111 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:03:28.408071 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/249120e8-680c-425e-8ee1-eccf87943943-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "249120e8-680c-425e-8ee1-eccf87943943" (UID: "249120e8-680c-425e-8ee1-eccf87943943"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 04:03:28.408111 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:03:28.408087 2578 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/249120e8-680c-425e-8ee1-eccf87943943-service-ca\") on node \"ip-10-0-134-136.ec2.internal\" DevicePath \"\"" Apr 21 04:03:28.408207 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:03:28.408102 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/249120e8-680c-425e-8ee1-eccf87943943-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "249120e8-680c-425e-8ee1-eccf87943943" (UID: "249120e8-680c-425e-8ee1-eccf87943943"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 04:03:28.408207 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:03:28.408148 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/249120e8-680c-425e-8ee1-eccf87943943-console-config" (OuterVolumeSpecName: "console-config") pod "249120e8-680c-425e-8ee1-eccf87943943" (UID: "249120e8-680c-425e-8ee1-eccf87943943"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 04:03:28.409805 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:03:28.409778 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/249120e8-680c-425e-8ee1-eccf87943943-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "249120e8-680c-425e-8ee1-eccf87943943" (UID: "249120e8-680c-425e-8ee1-eccf87943943"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 04:03:28.410186 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:03:28.410163 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/249120e8-680c-425e-8ee1-eccf87943943-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "249120e8-680c-425e-8ee1-eccf87943943" (UID: "249120e8-680c-425e-8ee1-eccf87943943"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 04:03:28.410186 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:03:28.410175 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/249120e8-680c-425e-8ee1-eccf87943943-kube-api-access-xlxww" (OuterVolumeSpecName: "kube-api-access-xlxww") pod "249120e8-680c-425e-8ee1-eccf87943943" (UID: "249120e8-680c-425e-8ee1-eccf87943943"). InnerVolumeSpecName "kube-api-access-xlxww". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 04:03:28.508543 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:03:28.508505 2578 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/249120e8-680c-425e-8ee1-eccf87943943-console-config\") on node \"ip-10-0-134-136.ec2.internal\" DevicePath \"\"" Apr 21 04:03:28.508543 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:03:28.508538 2578 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/249120e8-680c-425e-8ee1-eccf87943943-console-oauth-config\") on node \"ip-10-0-134-136.ec2.internal\" DevicePath \"\"" Apr 21 04:03:28.508543 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:03:28.508549 2578 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/249120e8-680c-425e-8ee1-eccf87943943-oauth-serving-cert\") on node \"ip-10-0-134-136.ec2.internal\" DevicePath \"\"" Apr 21 04:03:28.508783 ip-10-0-134-136 
kubenswrapper[2578]: I0421 04:03:28.508559 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xlxww\" (UniqueName: \"kubernetes.io/projected/249120e8-680c-425e-8ee1-eccf87943943-kube-api-access-xlxww\") on node \"ip-10-0-134-136.ec2.internal\" DevicePath \"\"" Apr 21 04:03:28.508783 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:03:28.508568 2578 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/249120e8-680c-425e-8ee1-eccf87943943-trusted-ca-bundle\") on node \"ip-10-0-134-136.ec2.internal\" DevicePath \"\"" Apr 21 04:03:28.508783 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:03:28.508578 2578 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/249120e8-680c-425e-8ee1-eccf87943943-console-serving-cert\") on node \"ip-10-0-134-136.ec2.internal\" DevicePath \"\"" Apr 21 04:03:29.081375 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:03:29.081347 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-54fd69787c-44b7j_249120e8-680c-425e-8ee1-eccf87943943/console/0.log" Apr 21 04:03:29.081775 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:03:29.081389 2578 generic.go:358] "Generic (PLEG): container finished" podID="249120e8-680c-425e-8ee1-eccf87943943" containerID="645b56f1b0f005e8aeaeb35d8e40fb4040f22bb45766a3b7f0f5c0d26e60ecba" exitCode=2 Apr 21 04:03:29.081775 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:03:29.081443 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-54fd69787c-44b7j" event={"ID":"249120e8-680c-425e-8ee1-eccf87943943","Type":"ContainerDied","Data":"645b56f1b0f005e8aeaeb35d8e40fb4040f22bb45766a3b7f0f5c0d26e60ecba"} Apr 21 04:03:29.081775 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:03:29.081458 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-54fd69787c-44b7j" Apr 21 04:03:29.081775 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:03:29.081465 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-54fd69787c-44b7j" event={"ID":"249120e8-680c-425e-8ee1-eccf87943943","Type":"ContainerDied","Data":"a940574be81010657900d1756100887e4d874bc8cd9ae402f3751e8598c4371e"} Apr 21 04:03:29.081775 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:03:29.081481 2578 scope.go:117] "RemoveContainer" containerID="645b56f1b0f005e8aeaeb35d8e40fb4040f22bb45766a3b7f0f5c0d26e60ecba" Apr 21 04:03:29.089514 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:03:29.089490 2578 scope.go:117] "RemoveContainer" containerID="645b56f1b0f005e8aeaeb35d8e40fb4040f22bb45766a3b7f0f5c0d26e60ecba" Apr 21 04:03:29.089815 ip-10-0-134-136 kubenswrapper[2578]: E0421 04:03:29.089796 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"645b56f1b0f005e8aeaeb35d8e40fb4040f22bb45766a3b7f0f5c0d26e60ecba\": container with ID starting with 645b56f1b0f005e8aeaeb35d8e40fb4040f22bb45766a3b7f0f5c0d26e60ecba not found: ID does not exist" containerID="645b56f1b0f005e8aeaeb35d8e40fb4040f22bb45766a3b7f0f5c0d26e60ecba" Apr 21 04:03:29.089909 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:03:29.089829 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"645b56f1b0f005e8aeaeb35d8e40fb4040f22bb45766a3b7f0f5c0d26e60ecba"} err="failed to get container status \"645b56f1b0f005e8aeaeb35d8e40fb4040f22bb45766a3b7f0f5c0d26e60ecba\": rpc error: code = NotFound desc = could not find container \"645b56f1b0f005e8aeaeb35d8e40fb4040f22bb45766a3b7f0f5c0d26e60ecba\": container with ID starting with 645b56f1b0f005e8aeaeb35d8e40fb4040f22bb45766a3b7f0f5c0d26e60ecba not found: ID does not exist" Apr 21 04:03:29.098843 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:03:29.098816 2578 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-54fd69787c-44b7j"] Apr 21 04:03:29.104432 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:03:29.104402 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-54fd69787c-44b7j"] Apr 21 04:03:30.928601 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:03:30.928568 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="249120e8-680c-425e-8ee1-eccf87943943" path="/var/lib/kubelet/pods/249120e8-680c-425e-8ee1-eccf87943943/volumes" Apr 21 04:03:51.480552 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:03:51.480507 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cr5vk8"] Apr 21 04:03:51.481048 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:03:51.480969 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="249120e8-680c-425e-8ee1-eccf87943943" containerName="console" Apr 21 04:03:51.481048 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:03:51.480987 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="249120e8-680c-425e-8ee1-eccf87943943" containerName="console" Apr 21 04:03:51.481164 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:03:51.481091 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="249120e8-680c-425e-8ee1-eccf87943943" containerName="console" Apr 21 04:03:51.484088 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:03:51.484066 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cr5vk8" Apr 21 04:03:51.486985 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:03:51.486963 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-k6n2p\"" Apr 21 04:03:51.486985 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:03:51.486963 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 21 04:03:51.487165 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:03:51.487017 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 21 04:03:51.491839 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:03:51.491617 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cr5vk8"] Apr 21 04:03:51.607983 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:03:51.607941 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dpz22\" (UniqueName: \"kubernetes.io/projected/6f2fdecc-1540-4c39-8b32-7ec6de0e0f9d-kube-api-access-dpz22\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cr5vk8\" (UID: \"6f2fdecc-1540-4c39-8b32-7ec6de0e0f9d\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cr5vk8" Apr 21 04:03:51.608165 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:03:51.608019 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6f2fdecc-1540-4c39-8b32-7ec6de0e0f9d-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cr5vk8\" (UID: \"6f2fdecc-1540-4c39-8b32-7ec6de0e0f9d\") " 
pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cr5vk8" Apr 21 04:03:51.608165 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:03:51.608064 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6f2fdecc-1540-4c39-8b32-7ec6de0e0f9d-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cr5vk8\" (UID: \"6f2fdecc-1540-4c39-8b32-7ec6de0e0f9d\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cr5vk8" Apr 21 04:03:51.709538 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:03:51.709506 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dpz22\" (UniqueName: \"kubernetes.io/projected/6f2fdecc-1540-4c39-8b32-7ec6de0e0f9d-kube-api-access-dpz22\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cr5vk8\" (UID: \"6f2fdecc-1540-4c39-8b32-7ec6de0e0f9d\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cr5vk8" Apr 21 04:03:51.709646 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:03:51.709572 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6f2fdecc-1540-4c39-8b32-7ec6de0e0f9d-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cr5vk8\" (UID: \"6f2fdecc-1540-4c39-8b32-7ec6de0e0f9d\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cr5vk8" Apr 21 04:03:51.709646 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:03:51.709615 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6f2fdecc-1540-4c39-8b32-7ec6de0e0f9d-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cr5vk8\" (UID: \"6f2fdecc-1540-4c39-8b32-7ec6de0e0f9d\") " 
pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cr5vk8" Apr 21 04:03:51.710066 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:03:51.710045 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6f2fdecc-1540-4c39-8b32-7ec6de0e0f9d-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cr5vk8\" (UID: \"6f2fdecc-1540-4c39-8b32-7ec6de0e0f9d\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cr5vk8" Apr 21 04:03:51.710102 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:03:51.710055 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6f2fdecc-1540-4c39-8b32-7ec6de0e0f9d-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cr5vk8\" (UID: \"6f2fdecc-1540-4c39-8b32-7ec6de0e0f9d\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cr5vk8" Apr 21 04:03:51.718167 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:03:51.718143 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dpz22\" (UniqueName: \"kubernetes.io/projected/6f2fdecc-1540-4c39-8b32-7ec6de0e0f9d-kube-api-access-dpz22\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cr5vk8\" (UID: \"6f2fdecc-1540-4c39-8b32-7ec6de0e0f9d\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cr5vk8" Apr 21 04:03:51.794370 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:03:51.794334 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cr5vk8" Apr 21 04:03:51.922657 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:03:51.922626 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cr5vk8"] Apr 21 04:03:51.925325 ip-10-0-134-136 kubenswrapper[2578]: W0421 04:03:51.925294 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6f2fdecc_1540_4c39_8b32_7ec6de0e0f9d.slice/crio-7c68b2770e8e1d5e629ce1d1184aa4dababb8fe07324f8931264db20e412de7d WatchSource:0}: Error finding container 7c68b2770e8e1d5e629ce1d1184aa4dababb8fe07324f8931264db20e412de7d: Status 404 returned error can't find the container with id 7c68b2770e8e1d5e629ce1d1184aa4dababb8fe07324f8931264db20e412de7d Apr 21 04:03:52.147089 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:03:52.146997 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cr5vk8" event={"ID":"6f2fdecc-1540-4c39-8b32-7ec6de0e0f9d","Type":"ContainerStarted","Data":"7c68b2770e8e1d5e629ce1d1184aa4dababb8fe07324f8931264db20e412de7d"} Apr 21 04:03:57.162507 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:03:57.162473 2578 generic.go:358] "Generic (PLEG): container finished" podID="6f2fdecc-1540-4c39-8b32-7ec6de0e0f9d" containerID="a8127dd62b6d8ea7703439b3d4794dabce09f838963319569a117c03e18a7dd2" exitCode=0 Apr 21 04:03:57.162862 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:03:57.162514 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cr5vk8" event={"ID":"6f2fdecc-1540-4c39-8b32-7ec6de0e0f9d","Type":"ContainerDied","Data":"a8127dd62b6d8ea7703439b3d4794dabce09f838963319569a117c03e18a7dd2"} Apr 21 04:03:59.171411 ip-10-0-134-136 kubenswrapper[2578]: 
I0421 04:03:59.171381 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cr5vk8" event={"ID":"6f2fdecc-1540-4c39-8b32-7ec6de0e0f9d","Type":"ContainerStarted","Data":"25538fb6cfa8afc8b599145ffbe251648f145112267957fc72beda407106ec50"} Apr 21 04:04:00.175940 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:04:00.175907 2578 generic.go:358] "Generic (PLEG): container finished" podID="6f2fdecc-1540-4c39-8b32-7ec6de0e0f9d" containerID="25538fb6cfa8afc8b599145ffbe251648f145112267957fc72beda407106ec50" exitCode=0 Apr 21 04:04:00.176346 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:04:00.175945 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cr5vk8" event={"ID":"6f2fdecc-1540-4c39-8b32-7ec6de0e0f9d","Type":"ContainerDied","Data":"25538fb6cfa8afc8b599145ffbe251648f145112267957fc72beda407106ec50"} Apr 21 04:04:06.198971 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:04:06.198935 2578 generic.go:358] "Generic (PLEG): container finished" podID="6f2fdecc-1540-4c39-8b32-7ec6de0e0f9d" containerID="a8119e3cbca125103a7e122f61227f545133b7de7b7a05d71fd96958496f9d47" exitCode=0 Apr 21 04:04:06.199356 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:04:06.199013 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cr5vk8" event={"ID":"6f2fdecc-1540-4c39-8b32-7ec6de0e0f9d","Type":"ContainerDied","Data":"a8119e3cbca125103a7e122f61227f545133b7de7b7a05d71fd96958496f9d47"} Apr 21 04:04:07.319440 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:04:07.319411 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cr5vk8" Apr 21 04:04:07.350411 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:04:07.350372 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6f2fdecc-1540-4c39-8b32-7ec6de0e0f9d-bundle\") pod \"6f2fdecc-1540-4c39-8b32-7ec6de0e0f9d\" (UID: \"6f2fdecc-1540-4c39-8b32-7ec6de0e0f9d\") " Apr 21 04:04:07.350411 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:04:07.350410 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6f2fdecc-1540-4c39-8b32-7ec6de0e0f9d-util\") pod \"6f2fdecc-1540-4c39-8b32-7ec6de0e0f9d\" (UID: \"6f2fdecc-1540-4c39-8b32-7ec6de0e0f9d\") " Apr 21 04:04:07.350672 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:04:07.350461 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dpz22\" (UniqueName: \"kubernetes.io/projected/6f2fdecc-1540-4c39-8b32-7ec6de0e0f9d-kube-api-access-dpz22\") pod \"6f2fdecc-1540-4c39-8b32-7ec6de0e0f9d\" (UID: \"6f2fdecc-1540-4c39-8b32-7ec6de0e0f9d\") " Apr 21 04:04:07.350950 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:04:07.350924 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f2fdecc-1540-4c39-8b32-7ec6de0e0f9d-bundle" (OuterVolumeSpecName: "bundle") pod "6f2fdecc-1540-4c39-8b32-7ec6de0e0f9d" (UID: "6f2fdecc-1540-4c39-8b32-7ec6de0e0f9d"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 04:04:07.352734 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:04:07.352703 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f2fdecc-1540-4c39-8b32-7ec6de0e0f9d-kube-api-access-dpz22" (OuterVolumeSpecName: "kube-api-access-dpz22") pod "6f2fdecc-1540-4c39-8b32-7ec6de0e0f9d" (UID: "6f2fdecc-1540-4c39-8b32-7ec6de0e0f9d"). InnerVolumeSpecName "kube-api-access-dpz22". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 04:04:07.354991 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:04:07.354963 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f2fdecc-1540-4c39-8b32-7ec6de0e0f9d-util" (OuterVolumeSpecName: "util") pod "6f2fdecc-1540-4c39-8b32-7ec6de0e0f9d" (UID: "6f2fdecc-1540-4c39-8b32-7ec6de0e0f9d"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 04:04:07.451529 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:04:07.451495 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dpz22\" (UniqueName: \"kubernetes.io/projected/6f2fdecc-1540-4c39-8b32-7ec6de0e0f9d-kube-api-access-dpz22\") on node \"ip-10-0-134-136.ec2.internal\" DevicePath \"\"" Apr 21 04:04:07.451529 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:04:07.451525 2578 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6f2fdecc-1540-4c39-8b32-7ec6de0e0f9d-bundle\") on node \"ip-10-0-134-136.ec2.internal\" DevicePath \"\"" Apr 21 04:04:07.451529 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:04:07.451534 2578 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6f2fdecc-1540-4c39-8b32-7ec6de0e0f9d-util\") on node \"ip-10-0-134-136.ec2.internal\" DevicePath \"\"" Apr 21 04:04:08.205571 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:04:08.205539 2578 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cr5vk8" Apr 21 04:04:08.205822 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:04:08.205546 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cr5vk8" event={"ID":"6f2fdecc-1540-4c39-8b32-7ec6de0e0f9d","Type":"ContainerDied","Data":"7c68b2770e8e1d5e629ce1d1184aa4dababb8fe07324f8931264db20e412de7d"} Apr 21 04:04:08.205822 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:04:08.205655 2578 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7c68b2770e8e1d5e629ce1d1184aa4dababb8fe07324f8931264db20e412de7d" Apr 21 04:04:13.091404 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:04:13.091366 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-qz6tc"] Apr 21 04:04:13.091811 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:04:13.091721 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6f2fdecc-1540-4c39-8b32-7ec6de0e0f9d" containerName="util" Apr 21 04:04:13.091811 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:04:13.091734 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f2fdecc-1540-4c39-8b32-7ec6de0e0f9d" containerName="util" Apr 21 04:04:13.091811 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:04:13.091745 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6f2fdecc-1540-4c39-8b32-7ec6de0e0f9d" containerName="pull" Apr 21 04:04:13.091811 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:04:13.091751 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f2fdecc-1540-4c39-8b32-7ec6de0e0f9d" containerName="pull" Apr 21 04:04:13.091811 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:04:13.091761 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="6f2fdecc-1540-4c39-8b32-7ec6de0e0f9d" containerName="extract" Apr 21 04:04:13.091811 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:04:13.091766 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f2fdecc-1540-4c39-8b32-7ec6de0e0f9d" containerName="extract" Apr 21 04:04:13.091995 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:04:13.091825 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="6f2fdecc-1540-4c39-8b32-7ec6de0e0f9d" containerName="extract" Apr 21 04:04:13.138675 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:04:13.138644 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-qz6tc"] Apr 21 04:04:13.138851 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:04:13.138765 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-qz6tc" Apr 21 04:04:13.143532 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:04:13.143502 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"custom-metrics-autoscaler-operator-dockercfg-xg5ql\"" Apr 21 04:04:13.143532 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:04:13.143527 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\"" Apr 21 04:04:13.143742 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:04:13.143545 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\"" Apr 21 04:04:13.143812 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:04:13.143798 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\"" Apr 21 04:04:13.185178 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:04:13.185141 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: 
\"kubernetes.io/secret/44f3a551-9d1d-4df8-90bd-2529f3606cc8-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-qz6tc\" (UID: \"44f3a551-9d1d-4df8-90bd-2529f3606cc8\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-qz6tc" Apr 21 04:04:13.185384 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:04:13.185184 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rq5pj\" (UniqueName: \"kubernetes.io/projected/44f3a551-9d1d-4df8-90bd-2529f3606cc8-kube-api-access-rq5pj\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-qz6tc\" (UID: \"44f3a551-9d1d-4df8-90bd-2529f3606cc8\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-qz6tc" Apr 21 04:04:13.286432 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:04:13.286396 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/44f3a551-9d1d-4df8-90bd-2529f3606cc8-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-qz6tc\" (UID: \"44f3a551-9d1d-4df8-90bd-2529f3606cc8\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-qz6tc" Apr 21 04:04:13.286610 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:04:13.286443 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rq5pj\" (UniqueName: \"kubernetes.io/projected/44f3a551-9d1d-4df8-90bd-2529f3606cc8-kube-api-access-rq5pj\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-qz6tc\" (UID: \"44f3a551-9d1d-4df8-90bd-2529f3606cc8\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-qz6tc" Apr 21 04:04:13.288838 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:04:13.288805 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/44f3a551-9d1d-4df8-90bd-2529f3606cc8-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-qz6tc\" (UID: 
\"44f3a551-9d1d-4df8-90bd-2529f3606cc8\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-qz6tc" Apr 21 04:04:13.295640 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:04:13.295615 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rq5pj\" (UniqueName: \"kubernetes.io/projected/44f3a551-9d1d-4df8-90bd-2529f3606cc8-kube-api-access-rq5pj\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-qz6tc\" (UID: \"44f3a551-9d1d-4df8-90bd-2529f3606cc8\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-qz6tc" Apr 21 04:04:13.449433 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:04:13.449332 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-qz6tc" Apr 21 04:04:13.580176 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:04:13.580151 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-qz6tc"] Apr 21 04:04:13.584708 ip-10-0-134-136 kubenswrapper[2578]: W0421 04:04:13.584653 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod44f3a551_9d1d_4df8_90bd_2529f3606cc8.slice/crio-0f502d79b27eb1b6a965881f948bbba5e64576e331078afaea6880466c1405b8 WatchSource:0}: Error finding container 0f502d79b27eb1b6a965881f948bbba5e64576e331078afaea6880466c1405b8: Status 404 returned error can't find the container with id 0f502d79b27eb1b6a965881f948bbba5e64576e331078afaea6880466c1405b8 Apr 21 04:04:14.227066 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:04:14.227024 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-qz6tc" event={"ID":"44f3a551-9d1d-4df8-90bd-2529f3606cc8","Type":"ContainerStarted","Data":"0f502d79b27eb1b6a965881f948bbba5e64576e331078afaea6880466c1405b8"} Apr 21 04:04:17.747622 ip-10-0-134-136 kubenswrapper[2578]: 
I0421 04:04:17.747585 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-5xjgn"] Apr 21 04:04:17.749801 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:04:17.749784 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-5xjgn" Apr 21 04:04:17.752355 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:04:17.752330 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\"" Apr 21 04:04:17.752500 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:04:17.752339 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-lwj7z\"" Apr 21 04:04:17.752500 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:04:17.752396 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-certs\"" Apr 21 04:04:17.758363 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:04:17.758334 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-5xjgn"] Apr 21 04:04:17.824751 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:04:17.824716 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hbfsf\" (UniqueName: \"kubernetes.io/projected/99edd0d0-958a-4344-ab2d-a13c47c17925-kube-api-access-hbfsf\") pod \"keda-operator-ffbb595cb-5xjgn\" (UID: \"99edd0d0-958a-4344-ab2d-a13c47c17925\") " pod="openshift-keda/keda-operator-ffbb595cb-5xjgn" Apr 21 04:04:17.824751 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:04:17.824759 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/99edd0d0-958a-4344-ab2d-a13c47c17925-cabundle0\") pod \"keda-operator-ffbb595cb-5xjgn\" (UID: \"99edd0d0-958a-4344-ab2d-a13c47c17925\") " 
pod="openshift-keda/keda-operator-ffbb595cb-5xjgn" Apr 21 04:04:17.824963 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:04:17.824793 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/99edd0d0-958a-4344-ab2d-a13c47c17925-certificates\") pod \"keda-operator-ffbb595cb-5xjgn\" (UID: \"99edd0d0-958a-4344-ab2d-a13c47c17925\") " pod="openshift-keda/keda-operator-ffbb595cb-5xjgn" Apr 21 04:04:17.926048 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:04:17.926010 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hbfsf\" (UniqueName: \"kubernetes.io/projected/99edd0d0-958a-4344-ab2d-a13c47c17925-kube-api-access-hbfsf\") pod \"keda-operator-ffbb595cb-5xjgn\" (UID: \"99edd0d0-958a-4344-ab2d-a13c47c17925\") " pod="openshift-keda/keda-operator-ffbb595cb-5xjgn" Apr 21 04:04:17.926048 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:04:17.926061 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/99edd0d0-958a-4344-ab2d-a13c47c17925-cabundle0\") pod \"keda-operator-ffbb595cb-5xjgn\" (UID: \"99edd0d0-958a-4344-ab2d-a13c47c17925\") " pod="openshift-keda/keda-operator-ffbb595cb-5xjgn" Apr 21 04:04:17.926309 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:04:17.926089 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/99edd0d0-958a-4344-ab2d-a13c47c17925-certificates\") pod \"keda-operator-ffbb595cb-5xjgn\" (UID: \"99edd0d0-958a-4344-ab2d-a13c47c17925\") " pod="openshift-keda/keda-operator-ffbb595cb-5xjgn" Apr 21 04:04:17.926309 ip-10-0-134-136 kubenswrapper[2578]: E0421 04:04:17.926208 2578 secret.go:281] references non-existent secret key: ca.crt Apr 21 04:04:17.926309 ip-10-0-134-136 kubenswrapper[2578]: E0421 04:04:17.926223 2578 projected.go:277] Couldn't get secret 
payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 21 04:04:17.926309 ip-10-0-134-136 kubenswrapper[2578]: E0421 04:04:17.926234 2578 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-5xjgn: references non-existent secret key: ca.crt Apr 21 04:04:17.926463 ip-10-0-134-136 kubenswrapper[2578]: E0421 04:04:17.926323 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/99edd0d0-958a-4344-ab2d-a13c47c17925-certificates podName:99edd0d0-958a-4344-ab2d-a13c47c17925 nodeName:}" failed. No retries permitted until 2026-04-21 04:04:18.426275441 +0000 UTC m=+402.095714507 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/99edd0d0-958a-4344-ab2d-a13c47c17925-certificates") pod "keda-operator-ffbb595cb-5xjgn" (UID: "99edd0d0-958a-4344-ab2d-a13c47c17925") : references non-existent secret key: ca.crt Apr 21 04:04:17.926790 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:04:17.926769 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/99edd0d0-958a-4344-ab2d-a13c47c17925-cabundle0\") pod \"keda-operator-ffbb595cb-5xjgn\" (UID: \"99edd0d0-958a-4344-ab2d-a13c47c17925\") " pod="openshift-keda/keda-operator-ffbb595cb-5xjgn" Apr 21 04:04:17.935913 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:04:17.935882 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hbfsf\" (UniqueName: \"kubernetes.io/projected/99edd0d0-958a-4344-ab2d-a13c47c17925-kube-api-access-hbfsf\") pod \"keda-operator-ffbb595cb-5xjgn\" (UID: \"99edd0d0-958a-4344-ab2d-a13c47c17925\") " pod="openshift-keda/keda-operator-ffbb595cb-5xjgn" Apr 21 04:04:18.090954 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:04:18.090916 2578 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-gfmjj"]
Apr 21 04:04:18.093352 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:04:18.093333 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-gfmjj"
Apr 21 04:04:18.095957 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:04:18.095936 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-metrics-apiserver-certs\""
Apr 21 04:04:18.102396 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:04:18.102365 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-gfmjj"]
Apr 21 04:04:18.128260 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:04:18.128226 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/01c48cd6-6498-4fad-97fd-84862384fd36-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-gfmjj\" (UID: \"01c48cd6-6498-4fad-97fd-84862384fd36\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-gfmjj"
Apr 21 04:04:18.128260 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:04:18.128259 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/01c48cd6-6498-4fad-97fd-84862384fd36-certificates\") pod \"keda-metrics-apiserver-7c9f485588-gfmjj\" (UID: \"01c48cd6-6498-4fad-97fd-84862384fd36\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-gfmjj"
Apr 21 04:04:18.128486 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:04:18.128344 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fd9tt\" (UniqueName: \"kubernetes.io/projected/01c48cd6-6498-4fad-97fd-84862384fd36-kube-api-access-fd9tt\") pod \"keda-metrics-apiserver-7c9f485588-gfmjj\" (UID: \"01c48cd6-6498-4fad-97fd-84862384fd36\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-gfmjj"
Apr 21 04:04:18.229800 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:04:18.229758 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fd9tt\" (UniqueName: \"kubernetes.io/projected/01c48cd6-6498-4fad-97fd-84862384fd36-kube-api-access-fd9tt\") pod \"keda-metrics-apiserver-7c9f485588-gfmjj\" (UID: \"01c48cd6-6498-4fad-97fd-84862384fd36\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-gfmjj"
Apr 21 04:04:18.230000 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:04:18.229827 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/01c48cd6-6498-4fad-97fd-84862384fd36-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-gfmjj\" (UID: \"01c48cd6-6498-4fad-97fd-84862384fd36\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-gfmjj"
Apr 21 04:04:18.230000 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:04:18.229855 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/01c48cd6-6498-4fad-97fd-84862384fd36-certificates\") pod \"keda-metrics-apiserver-7c9f485588-gfmjj\" (UID: \"01c48cd6-6498-4fad-97fd-84862384fd36\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-gfmjj"
Apr 21 04:04:18.230000 ip-10-0-134-136 kubenswrapper[2578]: E0421 04:04:18.229974 2578 secret.go:281] references non-existent secret key: tls.crt
Apr 21 04:04:18.230167 ip-10-0-134-136 kubenswrapper[2578]: E0421 04:04:18.230000 2578 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt
Apr 21 04:04:18.230167 ip-10-0-134-136 kubenswrapper[2578]: E0421 04:04:18.230020 2578 projected.go:264] Couldn't get secret openshift-keda/keda-metrics-apiserver-certs: secret "keda-metrics-apiserver-certs" not found
Apr 21 04:04:18.230167 ip-10-0-134-136 kubenswrapper[2578]: E0421 04:04:18.230043 2578 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-gfmjj: [references non-existent secret key: tls.crt, secret "keda-metrics-apiserver-certs" not found]
Apr 21 04:04:18.230167 ip-10-0-134-136 kubenswrapper[2578]: E0421 04:04:18.230126 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/01c48cd6-6498-4fad-97fd-84862384fd36-certificates podName:01c48cd6-6498-4fad-97fd-84862384fd36 nodeName:}" failed. No retries permitted until 2026-04-21 04:04:18.730103843 +0000 UTC m=+402.399542913 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/01c48cd6-6498-4fad-97fd-84862384fd36-certificates") pod "keda-metrics-apiserver-7c9f485588-gfmjj" (UID: "01c48cd6-6498-4fad-97fd-84862384fd36") : [references non-existent secret key: tls.crt, secret "keda-metrics-apiserver-certs" not found]
Apr 21 04:04:18.230417 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:04:18.230305 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/01c48cd6-6498-4fad-97fd-84862384fd36-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-gfmjj\" (UID: \"01c48cd6-6498-4fad-97fd-84862384fd36\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-gfmjj"
Apr 21 04:04:18.238917 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:04:18.238891 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fd9tt\" (UniqueName: \"kubernetes.io/projected/01c48cd6-6498-4fad-97fd-84862384fd36-kube-api-access-fd9tt\") pod \"keda-metrics-apiserver-7c9f485588-gfmjj\" (UID: \"01c48cd6-6498-4fad-97fd-84862384fd36\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-gfmjj"
Apr 21 04:04:18.244298 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:04:18.244245 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-qz6tc" event={"ID":"44f3a551-9d1d-4df8-90bd-2529f3606cc8","Type":"ContainerStarted","Data":"fdfab209e205ec58818e2bc4d3caa361845c08702f4d6fe35566ef28989f2560"}
Apr 21 04:04:18.244432 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:04:18.244357 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-qz6tc"
Apr 21 04:04:18.262179 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:04:18.262119 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-qz6tc" podStartSLOduration=1.615832025 podStartE2EDuration="5.262101797s" podCreationTimestamp="2026-04-21 04:04:13 +0000 UTC" firstStartedPulling="2026-04-21 04:04:13.587141715 +0000 UTC m=+397.256580767" lastFinishedPulling="2026-04-21 04:04:17.233411486 +0000 UTC m=+400.902850539" observedRunningTime="2026-04-21 04:04:18.26171551 +0000 UTC m=+401.931154586" watchObservedRunningTime="2026-04-21 04:04:18.262101797 +0000 UTC m=+401.931540877"
Apr 21 04:04:18.364031 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:04:18.363947 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-admission-cf49989db-wv8mk"]
Apr 21 04:04:18.366354 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:04:18.366336 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-wv8mk"
Apr 21 04:04:18.369059 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:04:18.369037 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-admission-webhooks-certs\""
Apr 21 04:04:18.377130 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:04:18.377105 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-wv8mk"]
Apr 21 04:04:18.431415 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:04:18.431366 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5r8w2\" (UniqueName: \"kubernetes.io/projected/6529d772-cb72-4a0e-9a7b-6f506c0fc0e2-kube-api-access-5r8w2\") pod \"keda-admission-cf49989db-wv8mk\" (UID: \"6529d772-cb72-4a0e-9a7b-6f506c0fc0e2\") " pod="openshift-keda/keda-admission-cf49989db-wv8mk"
Apr 21 04:04:18.431415 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:04:18.431416 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/99edd0d0-958a-4344-ab2d-a13c47c17925-certificates\") pod \"keda-operator-ffbb595cb-5xjgn\" (UID: \"99edd0d0-958a-4344-ab2d-a13c47c17925\") " pod="openshift-keda/keda-operator-ffbb595cb-5xjgn"
Apr 21 04:04:18.431676 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:04:18.431486 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/6529d772-cb72-4a0e-9a7b-6f506c0fc0e2-certificates\") pod \"keda-admission-cf49989db-wv8mk\" (UID: \"6529d772-cb72-4a0e-9a7b-6f506c0fc0e2\") " pod="openshift-keda/keda-admission-cf49989db-wv8mk"
Apr 21 04:04:18.431676 ip-10-0-134-136 kubenswrapper[2578]: E0421 04:04:18.431527 2578 secret.go:281] references non-existent secret key: ca.crt
Apr 21 04:04:18.431676 ip-10-0-134-136 kubenswrapper[2578]: E0421 04:04:18.431539 2578 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 21 04:04:18.431676 ip-10-0-134-136 kubenswrapper[2578]: E0421 04:04:18.431548 2578 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-5xjgn: references non-existent secret key: ca.crt
Apr 21 04:04:18.431676 ip-10-0-134-136 kubenswrapper[2578]: E0421 04:04:18.431597 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/99edd0d0-958a-4344-ab2d-a13c47c17925-certificates podName:99edd0d0-958a-4344-ab2d-a13c47c17925 nodeName:}" failed. No retries permitted until 2026-04-21 04:04:19.431582943 +0000 UTC m=+403.101022031 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/99edd0d0-958a-4344-ab2d-a13c47c17925-certificates") pod "keda-operator-ffbb595cb-5xjgn" (UID: "99edd0d0-958a-4344-ab2d-a13c47c17925") : references non-existent secret key: ca.crt
Apr 21 04:04:18.532866 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:04:18.532827 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5r8w2\" (UniqueName: \"kubernetes.io/projected/6529d772-cb72-4a0e-9a7b-6f506c0fc0e2-kube-api-access-5r8w2\") pod \"keda-admission-cf49989db-wv8mk\" (UID: \"6529d772-cb72-4a0e-9a7b-6f506c0fc0e2\") " pod="openshift-keda/keda-admission-cf49989db-wv8mk"
Apr 21 04:04:18.533047 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:04:18.532905 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/6529d772-cb72-4a0e-9a7b-6f506c0fc0e2-certificates\") pod \"keda-admission-cf49989db-wv8mk\" (UID: \"6529d772-cb72-4a0e-9a7b-6f506c0fc0e2\") " pod="openshift-keda/keda-admission-cf49989db-wv8mk"
Apr 21 04:04:18.533094 ip-10-0-134-136 kubenswrapper[2578]: E0421 04:04:18.533045 2578 projected.go:264] Couldn't get secret openshift-keda/keda-admission-webhooks-certs: secret "keda-admission-webhooks-certs" not found
Apr 21 04:04:18.533094 ip-10-0-134-136 kubenswrapper[2578]: E0421 04:04:18.533071 2578 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-admission-cf49989db-wv8mk: secret "keda-admission-webhooks-certs" not found
Apr 21 04:04:18.533179 ip-10-0-134-136 kubenswrapper[2578]: E0421 04:04:18.533117 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6529d772-cb72-4a0e-9a7b-6f506c0fc0e2-certificates podName:6529d772-cb72-4a0e-9a7b-6f506c0fc0e2 nodeName:}" failed. No retries permitted until 2026-04-21 04:04:19.033100752 +0000 UTC m=+402.702539808 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/6529d772-cb72-4a0e-9a7b-6f506c0fc0e2-certificates") pod "keda-admission-cf49989db-wv8mk" (UID: "6529d772-cb72-4a0e-9a7b-6f506c0fc0e2") : secret "keda-admission-webhooks-certs" not found
Apr 21 04:04:18.544937 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:04:18.544900 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5r8w2\" (UniqueName: \"kubernetes.io/projected/6529d772-cb72-4a0e-9a7b-6f506c0fc0e2-kube-api-access-5r8w2\") pod \"keda-admission-cf49989db-wv8mk\" (UID: \"6529d772-cb72-4a0e-9a7b-6f506c0fc0e2\") " pod="openshift-keda/keda-admission-cf49989db-wv8mk"
Apr 21 04:04:18.735798 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:04:18.735692 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/01c48cd6-6498-4fad-97fd-84862384fd36-certificates\") pod \"keda-metrics-apiserver-7c9f485588-gfmjj\" (UID: \"01c48cd6-6498-4fad-97fd-84862384fd36\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-gfmjj"
Apr 21 04:04:18.735973 ip-10-0-134-136 kubenswrapper[2578]: E0421 04:04:18.735864 2578 secret.go:281] references non-existent secret key: tls.crt
Apr 21 04:04:18.735973 ip-10-0-134-136 kubenswrapper[2578]: E0421 04:04:18.735881 2578 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt
Apr 21 04:04:18.735973 ip-10-0-134-136 kubenswrapper[2578]: E0421 04:04:18.735904 2578 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-gfmjj: references non-existent secret key: tls.crt
Apr 21 04:04:18.735973 ip-10-0-134-136 kubenswrapper[2578]: E0421 04:04:18.735973 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/01c48cd6-6498-4fad-97fd-84862384fd36-certificates podName:01c48cd6-6498-4fad-97fd-84862384fd36 nodeName:}" failed. No retries permitted until 2026-04-21 04:04:19.735954852 +0000 UTC m=+403.405393904 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/01c48cd6-6498-4fad-97fd-84862384fd36-certificates") pod "keda-metrics-apiserver-7c9f485588-gfmjj" (UID: "01c48cd6-6498-4fad-97fd-84862384fd36") : references non-existent secret key: tls.crt
Apr 21 04:04:19.038720 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:04:19.038681 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/6529d772-cb72-4a0e-9a7b-6f506c0fc0e2-certificates\") pod \"keda-admission-cf49989db-wv8mk\" (UID: \"6529d772-cb72-4a0e-9a7b-6f506c0fc0e2\") " pod="openshift-keda/keda-admission-cf49989db-wv8mk"
Apr 21 04:04:19.041367 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:04:19.041336 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/6529d772-cb72-4a0e-9a7b-6f506c0fc0e2-certificates\") pod \"keda-admission-cf49989db-wv8mk\" (UID: \"6529d772-cb72-4a0e-9a7b-6f506c0fc0e2\") " pod="openshift-keda/keda-admission-cf49989db-wv8mk"
Apr 21 04:04:19.278232 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:04:19.278197 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-wv8mk"
Apr 21 04:04:19.412115 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:04:19.412090 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-wv8mk"]
Apr 21 04:04:19.414299 ip-10-0-134-136 kubenswrapper[2578]: W0421 04:04:19.414255 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6529d772_cb72_4a0e_9a7b_6f506c0fc0e2.slice/crio-cb3852fbdefdda6a42fdebd8426c59997598c3fdcd7e1637ea99d0e4013ecd62 WatchSource:0}: Error finding container cb3852fbdefdda6a42fdebd8426c59997598c3fdcd7e1637ea99d0e4013ecd62: Status 404 returned error can't find the container with id cb3852fbdefdda6a42fdebd8426c59997598c3fdcd7e1637ea99d0e4013ecd62
Apr 21 04:04:19.441846 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:04:19.441809 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/99edd0d0-958a-4344-ab2d-a13c47c17925-certificates\") pod \"keda-operator-ffbb595cb-5xjgn\" (UID: \"99edd0d0-958a-4344-ab2d-a13c47c17925\") " pod="openshift-keda/keda-operator-ffbb595cb-5xjgn"
Apr 21 04:04:19.442038 ip-10-0-134-136 kubenswrapper[2578]: E0421 04:04:19.441958 2578 secret.go:281] references non-existent secret key: ca.crt
Apr 21 04:04:19.442038 ip-10-0-134-136 kubenswrapper[2578]: E0421 04:04:19.441980 2578 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 21 04:04:19.442038 ip-10-0-134-136 kubenswrapper[2578]: E0421 04:04:19.441991 2578 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-5xjgn: references non-existent secret key: ca.crt
Apr 21 04:04:19.442187 ip-10-0-134-136 kubenswrapper[2578]: E0421 04:04:19.442051 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/99edd0d0-958a-4344-ab2d-a13c47c17925-certificates podName:99edd0d0-958a-4344-ab2d-a13c47c17925 nodeName:}" failed. No retries permitted until 2026-04-21 04:04:21.442029264 +0000 UTC m=+405.111468317 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/99edd0d0-958a-4344-ab2d-a13c47c17925-certificates") pod "keda-operator-ffbb595cb-5xjgn" (UID: "99edd0d0-958a-4344-ab2d-a13c47c17925") : references non-existent secret key: ca.crt
Apr 21 04:04:19.744320 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:04:19.744189 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/01c48cd6-6498-4fad-97fd-84862384fd36-certificates\") pod \"keda-metrics-apiserver-7c9f485588-gfmjj\" (UID: \"01c48cd6-6498-4fad-97fd-84862384fd36\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-gfmjj"
Apr 21 04:04:19.744502 ip-10-0-134-136 kubenswrapper[2578]: E0421 04:04:19.744344 2578 secret.go:281] references non-existent secret key: tls.crt
Apr 21 04:04:19.744502 ip-10-0-134-136 kubenswrapper[2578]: E0421 04:04:19.744363 2578 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt
Apr 21 04:04:19.744502 ip-10-0-134-136 kubenswrapper[2578]: E0421 04:04:19.744390 2578 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-gfmjj: references non-existent secret key: tls.crt
Apr 21 04:04:19.744502 ip-10-0-134-136 kubenswrapper[2578]: E0421 04:04:19.744455 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/01c48cd6-6498-4fad-97fd-84862384fd36-certificates podName:01c48cd6-6498-4fad-97fd-84862384fd36 nodeName:}" failed. No retries permitted until 2026-04-21 04:04:21.744436324 +0000 UTC m=+405.413875377 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/01c48cd6-6498-4fad-97fd-84862384fd36-certificates") pod "keda-metrics-apiserver-7c9f485588-gfmjj" (UID: "01c48cd6-6498-4fad-97fd-84862384fd36") : references non-existent secret key: tls.crt
Apr 21 04:04:20.252364 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:04:20.252313 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-wv8mk" event={"ID":"6529d772-cb72-4a0e-9a7b-6f506c0fc0e2","Type":"ContainerStarted","Data":"cb3852fbdefdda6a42fdebd8426c59997598c3fdcd7e1637ea99d0e4013ecd62"}
Apr 21 04:04:21.257369 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:04:21.257318 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-wv8mk" event={"ID":"6529d772-cb72-4a0e-9a7b-6f506c0fc0e2","Type":"ContainerStarted","Data":"806f705bec803d083c70d85d3db313101dd8df797b806e3dc368ec1dbc318241"}
Apr 21 04:04:21.257963 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:04:21.257459 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-admission-cf49989db-wv8mk"
Apr 21 04:04:21.276223 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:04:21.276174 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-admission-cf49989db-wv8mk" podStartSLOduration=1.984280863 podStartE2EDuration="3.276161949s" podCreationTimestamp="2026-04-21 04:04:18 +0000 UTC" firstStartedPulling="2026-04-21 04:04:19.415645553 +0000 UTC m=+403.085084607" lastFinishedPulling="2026-04-21 04:04:20.707526637 +0000 UTC m=+404.376965693" observedRunningTime="2026-04-21 04:04:21.273510411 +0000 UTC m=+404.942949483" watchObservedRunningTime="2026-04-21 04:04:21.276161949 +0000 UTC m=+404.945601025"
Apr 21 04:04:21.458505 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:04:21.458470 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/99edd0d0-958a-4344-ab2d-a13c47c17925-certificates\") pod \"keda-operator-ffbb595cb-5xjgn\" (UID: \"99edd0d0-958a-4344-ab2d-a13c47c17925\") " pod="openshift-keda/keda-operator-ffbb595cb-5xjgn"
Apr 21 04:04:21.458656 ip-10-0-134-136 kubenswrapper[2578]: E0421 04:04:21.458625 2578 secret.go:281] references non-existent secret key: ca.crt
Apr 21 04:04:21.458656 ip-10-0-134-136 kubenswrapper[2578]: E0421 04:04:21.458647 2578 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 21 04:04:21.458656 ip-10-0-134-136 kubenswrapper[2578]: E0421 04:04:21.458656 2578 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-5xjgn: references non-existent secret key: ca.crt
Apr 21 04:04:21.458772 ip-10-0-134-136 kubenswrapper[2578]: E0421 04:04:21.458707 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/99edd0d0-958a-4344-ab2d-a13c47c17925-certificates podName:99edd0d0-958a-4344-ab2d-a13c47c17925 nodeName:}" failed. No retries permitted until 2026-04-21 04:04:25.458691704 +0000 UTC m=+409.128130757 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/99edd0d0-958a-4344-ab2d-a13c47c17925-certificates") pod "keda-operator-ffbb595cb-5xjgn" (UID: "99edd0d0-958a-4344-ab2d-a13c47c17925") : references non-existent secret key: ca.crt
Apr 21 04:04:21.762049 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:04:21.762006 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/01c48cd6-6498-4fad-97fd-84862384fd36-certificates\") pod \"keda-metrics-apiserver-7c9f485588-gfmjj\" (UID: \"01c48cd6-6498-4fad-97fd-84862384fd36\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-gfmjj"
Apr 21 04:04:21.762235 ip-10-0-134-136 kubenswrapper[2578]: E0421 04:04:21.762162 2578 secret.go:281] references non-existent secret key: tls.crt
Apr 21 04:04:21.762235 ip-10-0-134-136 kubenswrapper[2578]: E0421 04:04:21.762185 2578 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt
Apr 21 04:04:21.762235 ip-10-0-134-136 kubenswrapper[2578]: E0421 04:04:21.762205 2578 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-gfmjj: references non-existent secret key: tls.crt
Apr 21 04:04:21.762369 ip-10-0-134-136 kubenswrapper[2578]: E0421 04:04:21.762257 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/01c48cd6-6498-4fad-97fd-84862384fd36-certificates podName:01c48cd6-6498-4fad-97fd-84862384fd36 nodeName:}" failed. No retries permitted until 2026-04-21 04:04:25.762241259 +0000 UTC m=+409.431680312 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/01c48cd6-6498-4fad-97fd-84862384fd36-certificates") pod "keda-metrics-apiserver-7c9f485588-gfmjj" (UID: "01c48cd6-6498-4fad-97fd-84862384fd36") : references non-existent secret key: tls.crt
Apr 21 04:04:25.493942 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:04:25.493897 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/99edd0d0-958a-4344-ab2d-a13c47c17925-certificates\") pod \"keda-operator-ffbb595cb-5xjgn\" (UID: \"99edd0d0-958a-4344-ab2d-a13c47c17925\") " pod="openshift-keda/keda-operator-ffbb595cb-5xjgn"
Apr 21 04:04:25.496346 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:04:25.496326 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/99edd0d0-958a-4344-ab2d-a13c47c17925-certificates\") pod \"keda-operator-ffbb595cb-5xjgn\" (UID: \"99edd0d0-958a-4344-ab2d-a13c47c17925\") " pod="openshift-keda/keda-operator-ffbb595cb-5xjgn"
Apr 21 04:04:25.561303 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:04:25.561238 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-5xjgn"
Apr 21 04:04:25.696778 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:04:25.696740 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-5xjgn"]
Apr 21 04:04:25.697254 ip-10-0-134-136 kubenswrapper[2578]: W0421 04:04:25.697228 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod99edd0d0_958a_4344_ab2d_a13c47c17925.slice/crio-443d1d18eb2850a45d67e9e569c00003b6f640b1fbb65ca7947a33e671643274 WatchSource:0}: Error finding container 443d1d18eb2850a45d67e9e569c00003b6f640b1fbb65ca7947a33e671643274: Status 404 returned error can't find the container with id 443d1d18eb2850a45d67e9e569c00003b6f640b1fbb65ca7947a33e671643274
Apr 21 04:04:25.796347 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:04:25.796306 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/01c48cd6-6498-4fad-97fd-84862384fd36-certificates\") pod \"keda-metrics-apiserver-7c9f485588-gfmjj\" (UID: \"01c48cd6-6498-4fad-97fd-84862384fd36\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-gfmjj"
Apr 21 04:04:25.798829 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:04:25.798801 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/01c48cd6-6498-4fad-97fd-84862384fd36-certificates\") pod \"keda-metrics-apiserver-7c9f485588-gfmjj\" (UID: \"01c48cd6-6498-4fad-97fd-84862384fd36\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-gfmjj"
Apr 21 04:04:25.905263 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:04:25.905225 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-gfmjj"
Apr 21 04:04:26.032554 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:04:26.032519 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-gfmjj"]
Apr 21 04:04:26.036195 ip-10-0-134-136 kubenswrapper[2578]: W0421 04:04:26.036162 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod01c48cd6_6498_4fad_97fd_84862384fd36.slice/crio-cf8adcf778a46b0589447ec229a88c74a0a475e70147501f5f14d24cc6aa8827 WatchSource:0}: Error finding container cf8adcf778a46b0589447ec229a88c74a0a475e70147501f5f14d24cc6aa8827: Status 404 returned error can't find the container with id cf8adcf778a46b0589447ec229a88c74a0a475e70147501f5f14d24cc6aa8827
Apr 21 04:04:26.274948 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:04:26.274905 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-gfmjj" event={"ID":"01c48cd6-6498-4fad-97fd-84862384fd36","Type":"ContainerStarted","Data":"cf8adcf778a46b0589447ec229a88c74a0a475e70147501f5f14d24cc6aa8827"}
Apr 21 04:04:26.276448 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:04:26.276412 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-5xjgn" event={"ID":"99edd0d0-958a-4344-ab2d-a13c47c17925","Type":"ContainerStarted","Data":"443d1d18eb2850a45d67e9e569c00003b6f640b1fbb65ca7947a33e671643274"}
Apr 21 04:04:30.293141 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:04:30.293106 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-gfmjj" event={"ID":"01c48cd6-6498-4fad-97fd-84862384fd36","Type":"ContainerStarted","Data":"d286ee4a50bdd29ef36b48c4928a2a27686a03f606fe5c549ce360d6152617dd"}
Apr 21 04:04:30.293589 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:04:30.293405 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-gfmjj"
Apr 21 04:04:30.294703 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:04:30.294676 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-5xjgn" event={"ID":"99edd0d0-958a-4344-ab2d-a13c47c17925","Type":"ContainerStarted","Data":"448919ba10ab811b5c9815d2b333d5409c4d39d5d6cb538ab56a8bf4d7e1e2f0"}
Apr 21 04:04:30.294851 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:04:30.294772 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-operator-ffbb595cb-5xjgn"
Apr 21 04:04:30.309297 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:04:30.309215 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-gfmjj" podStartSLOduration=8.918663791 podStartE2EDuration="12.309194924s" podCreationTimestamp="2026-04-21 04:04:18 +0000 UTC" firstStartedPulling="2026-04-21 04:04:26.037796543 +0000 UTC m=+409.707235599" lastFinishedPulling="2026-04-21 04:04:29.428327667 +0000 UTC m=+413.097766732" observedRunningTime="2026-04-21 04:04:30.308695304 +0000 UTC m=+413.978134393" watchObservedRunningTime="2026-04-21 04:04:30.309194924 +0000 UTC m=+413.978634000"
Apr 21 04:04:30.323340 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:04:30.323264 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-operator-ffbb595cb-5xjgn" podStartSLOduration=9.593627819 podStartE2EDuration="13.323245995s" podCreationTimestamp="2026-04-21 04:04:17 +0000 UTC" firstStartedPulling="2026-04-21 04:04:25.698678611 +0000 UTC m=+409.368117665" lastFinishedPulling="2026-04-21 04:04:29.428296785 +0000 UTC m=+413.097735841" observedRunningTime="2026-04-21 04:04:30.322326471 +0000 UTC m=+413.991765549" watchObservedRunningTime="2026-04-21 04:04:30.323245995 +0000 UTC m=+413.992685070"
Apr 21 04:04:39.249763 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:04:39.249737 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-qz6tc"
Apr 21 04:04:41.304349 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:04:41.304320 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-gfmjj"
Apr 21 04:04:42.263809 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:04:42.263779 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-admission-cf49989db-wv8mk"
Apr 21 04:04:51.302534 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:04:51.302460 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-operator-ffbb595cb-5xjgn"
Apr 21 04:05:11.326087 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:05:11.326052 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d6n525"]
Apr 21 04:05:11.335188 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:05:11.335161 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d6n525"
Apr 21 04:05:11.337631 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:05:11.337591 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d6n525"]
Apr 21 04:05:11.337806 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:05:11.337782 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 21 04:05:11.337926 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:05:11.337876 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 21 04:05:11.339029 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:05:11.339007 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-k6n2p\""
Apr 21 04:05:11.405620 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:05:11.405580 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/67cf61dc-aef9-4ee9-b879-6b61bdef1215-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d6n525\" (UID: \"67cf61dc-aef9-4ee9-b879-6b61bdef1215\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d6n525"
Apr 21 04:05:11.405807 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:05:11.405628 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkdcm\" (UniqueName: \"kubernetes.io/projected/67cf61dc-aef9-4ee9-b879-6b61bdef1215-kube-api-access-wkdcm\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d6n525\" (UID: \"67cf61dc-aef9-4ee9-b879-6b61bdef1215\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d6n525"
Apr 21 04:05:11.405807 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:05:11.405738 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/67cf61dc-aef9-4ee9-b879-6b61bdef1215-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d6n525\" (UID: \"67cf61dc-aef9-4ee9-b879-6b61bdef1215\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d6n525"
Apr 21 04:05:11.506409 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:05:11.506367 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/67cf61dc-aef9-4ee9-b879-6b61bdef1215-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d6n525\" (UID: \"67cf61dc-aef9-4ee9-b879-6b61bdef1215\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d6n525"
Apr 21 04:05:11.506595 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:05:11.506459 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wkdcm\" (UniqueName: \"kubernetes.io/projected/67cf61dc-aef9-4ee9-b879-6b61bdef1215-kube-api-access-wkdcm\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d6n525\" (UID: \"67cf61dc-aef9-4ee9-b879-6b61bdef1215\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d6n525"
Apr 21 04:05:11.506640 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:05:11.506617 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/67cf61dc-aef9-4ee9-b879-6b61bdef1215-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d6n525\" (UID: \"67cf61dc-aef9-4ee9-b879-6b61bdef1215\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d6n525"
Apr 21 04:05:11.506801 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:05:11.506786 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/67cf61dc-aef9-4ee9-b879-6b61bdef1215-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d6n525\" (UID: \"67cf61dc-aef9-4ee9-b879-6b61bdef1215\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d6n525"
Apr 21 04:05:11.506942 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:05:11.506923 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/67cf61dc-aef9-4ee9-b879-6b61bdef1215-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d6n525\" (UID: \"67cf61dc-aef9-4ee9-b879-6b61bdef1215\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d6n525"
Apr 21 04:05:11.514986 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:05:11.514965 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wkdcm\" (UniqueName: \"kubernetes.io/projected/67cf61dc-aef9-4ee9-b879-6b61bdef1215-kube-api-access-wkdcm\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d6n525\" (UID: \"67cf61dc-aef9-4ee9-b879-6b61bdef1215\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d6n525"
Apr 21 04:05:11.644869 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:05:11.644777 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d6n525"
Apr 21 04:05:11.767091 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:05:11.767058 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d6n525"]
Apr 21 04:05:11.770272 ip-10-0-134-136 kubenswrapper[2578]: W0421 04:05:11.770232 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod67cf61dc_aef9_4ee9_b879_6b61bdef1215.slice/crio-34eafc09a7cf1785f2b5e29dfe54e3698e7b9b8a85a368b9895108d30e086dd5 WatchSource:0}: Error finding container 34eafc09a7cf1785f2b5e29dfe54e3698e7b9b8a85a368b9895108d30e086dd5: Status 404 returned error can't find the container with id 34eafc09a7cf1785f2b5e29dfe54e3698e7b9b8a85a368b9895108d30e086dd5
Apr 21 04:05:12.439186 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:05:12.439151 2578 generic.go:358] "Generic (PLEG): container finished" podID="67cf61dc-aef9-4ee9-b879-6b61bdef1215" containerID="0e7e904e94579861461d5c62532e13bd78a708742025f4a2d387d4cf692ed75f" exitCode=0
Apr 21 04:05:12.439591 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:05:12.439240 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d6n525" event={"ID":"67cf61dc-aef9-4ee9-b879-6b61bdef1215","Type":"ContainerDied","Data":"0e7e904e94579861461d5c62532e13bd78a708742025f4a2d387d4cf692ed75f"}
Apr 21 04:05:12.439591 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:05:12.439274 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d6n525" event={"ID":"67cf61dc-aef9-4ee9-b879-6b61bdef1215","Type":"ContainerStarted","Data":"34eafc09a7cf1785f2b5e29dfe54e3698e7b9b8a85a368b9895108d30e086dd5"}
Apr 21 04:05:14.448339 ip-10-0-134-136 kubenswrapper[2578]:
I0421 04:05:14.448311 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d6n525" event={"ID":"67cf61dc-aef9-4ee9-b879-6b61bdef1215","Type":"ContainerStarted","Data":"d3dcbc13561d9ef649b5c44066f9b087bb1ed4c0646ec73cbc7d5b62618493f7"} Apr 21 04:05:15.453085 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:05:15.453052 2578 generic.go:358] "Generic (PLEG): container finished" podID="67cf61dc-aef9-4ee9-b879-6b61bdef1215" containerID="d3dcbc13561d9ef649b5c44066f9b087bb1ed4c0646ec73cbc7d5b62618493f7" exitCode=0 Apr 21 04:05:15.453085 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:05:15.453078 2578 generic.go:358] "Generic (PLEG): container finished" podID="67cf61dc-aef9-4ee9-b879-6b61bdef1215" containerID="9e659a60a88b10704c6875af8934a66bf24783a39d745d616bf7e6790f1675ed" exitCode=0 Apr 21 04:05:15.453532 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:05:15.453097 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d6n525" event={"ID":"67cf61dc-aef9-4ee9-b879-6b61bdef1215","Type":"ContainerDied","Data":"d3dcbc13561d9ef649b5c44066f9b087bb1ed4c0646ec73cbc7d5b62618493f7"} Apr 21 04:05:15.453532 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:05:15.453120 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d6n525" event={"ID":"67cf61dc-aef9-4ee9-b879-6b61bdef1215","Type":"ContainerDied","Data":"9e659a60a88b10704c6875af8934a66bf24783a39d745d616bf7e6790f1675ed"} Apr 21 04:05:16.578246 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:05:16.578219 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d6n525" Apr 21 04:05:16.652032 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:05:16.651998 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wkdcm\" (UniqueName: \"kubernetes.io/projected/67cf61dc-aef9-4ee9-b879-6b61bdef1215-kube-api-access-wkdcm\") pod \"67cf61dc-aef9-4ee9-b879-6b61bdef1215\" (UID: \"67cf61dc-aef9-4ee9-b879-6b61bdef1215\") " Apr 21 04:05:16.652234 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:05:16.652048 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/67cf61dc-aef9-4ee9-b879-6b61bdef1215-util\") pod \"67cf61dc-aef9-4ee9-b879-6b61bdef1215\" (UID: \"67cf61dc-aef9-4ee9-b879-6b61bdef1215\") " Apr 21 04:05:16.652234 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:05:16.652094 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/67cf61dc-aef9-4ee9-b879-6b61bdef1215-bundle\") pod \"67cf61dc-aef9-4ee9-b879-6b61bdef1215\" (UID: \"67cf61dc-aef9-4ee9-b879-6b61bdef1215\") " Apr 21 04:05:16.652870 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:05:16.652841 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/67cf61dc-aef9-4ee9-b879-6b61bdef1215-bundle" (OuterVolumeSpecName: "bundle") pod "67cf61dc-aef9-4ee9-b879-6b61bdef1215" (UID: "67cf61dc-aef9-4ee9-b879-6b61bdef1215"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 04:05:16.654226 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:05:16.654204 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67cf61dc-aef9-4ee9-b879-6b61bdef1215-kube-api-access-wkdcm" (OuterVolumeSpecName: "kube-api-access-wkdcm") pod "67cf61dc-aef9-4ee9-b879-6b61bdef1215" (UID: "67cf61dc-aef9-4ee9-b879-6b61bdef1215"). InnerVolumeSpecName "kube-api-access-wkdcm". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 04:05:16.659355 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:05:16.659329 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/67cf61dc-aef9-4ee9-b879-6b61bdef1215-util" (OuterVolumeSpecName: "util") pod "67cf61dc-aef9-4ee9-b879-6b61bdef1215" (UID: "67cf61dc-aef9-4ee9-b879-6b61bdef1215"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 04:05:16.753602 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:05:16.753505 2578 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/67cf61dc-aef9-4ee9-b879-6b61bdef1215-util\") on node \"ip-10-0-134-136.ec2.internal\" DevicePath \"\"" Apr 21 04:05:16.753602 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:05:16.753543 2578 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/67cf61dc-aef9-4ee9-b879-6b61bdef1215-bundle\") on node \"ip-10-0-134-136.ec2.internal\" DevicePath \"\"" Apr 21 04:05:16.753602 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:05:16.753556 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wkdcm\" (UniqueName: \"kubernetes.io/projected/67cf61dc-aef9-4ee9-b879-6b61bdef1215-kube-api-access-wkdcm\") on node \"ip-10-0-134-136.ec2.internal\" DevicePath \"\"" Apr 21 04:05:17.461434 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:05:17.461390 2578 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d6n525" event={"ID":"67cf61dc-aef9-4ee9-b879-6b61bdef1215","Type":"ContainerDied","Data":"34eafc09a7cf1785f2b5e29dfe54e3698e7b9b8a85a368b9895108d30e086dd5"} Apr 21 04:05:17.461434 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:05:17.461422 2578 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="34eafc09a7cf1785f2b5e29dfe54e3698e7b9b8a85a368b9895108d30e086dd5" Apr 21 04:05:17.461434 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:05:17.461425 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d6n525" Apr 21 04:05:33.432718 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:05:33.432683 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fqzw7n"] Apr 21 04:05:33.433085 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:05:33.433048 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="67cf61dc-aef9-4ee9-b879-6b61bdef1215" containerName="extract" Apr 21 04:05:33.433085 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:05:33.433059 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="67cf61dc-aef9-4ee9-b879-6b61bdef1215" containerName="extract" Apr 21 04:05:33.433085 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:05:33.433074 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="67cf61dc-aef9-4ee9-b879-6b61bdef1215" containerName="util" Apr 21 04:05:33.433085 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:05:33.433080 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="67cf61dc-aef9-4ee9-b879-6b61bdef1215" containerName="util" Apr 21 04:05:33.433206 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:05:33.433092 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing 
container" podUID="67cf61dc-aef9-4ee9-b879-6b61bdef1215" containerName="pull" Apr 21 04:05:33.433206 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:05:33.433098 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="67cf61dc-aef9-4ee9-b879-6b61bdef1215" containerName="pull" Apr 21 04:05:33.433206 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:05:33.433150 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="67cf61dc-aef9-4ee9-b879-6b61bdef1215" containerName="extract" Apr 21 04:05:33.436293 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:05:33.436266 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fqzw7n" Apr 21 04:05:33.438705 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:05:33.438677 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 21 04:05:33.439727 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:05:33.439711 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 21 04:05:33.439727 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:05:33.439719 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-k6n2p\"" Apr 21 04:05:33.443578 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:05:33.443550 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fqzw7n"] Apr 21 04:05:33.496396 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:05:33.496358 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/edfbb979-22f1-4a0d-ab4f-cff09e420a92-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fqzw7n\" (UID: 
\"edfbb979-22f1-4a0d-ab4f-cff09e420a92\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fqzw7n" Apr 21 04:05:33.496396 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:05:33.496396 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xsxww\" (UniqueName: \"kubernetes.io/projected/edfbb979-22f1-4a0d-ab4f-cff09e420a92-kube-api-access-xsxww\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fqzw7n\" (UID: \"edfbb979-22f1-4a0d-ab4f-cff09e420a92\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fqzw7n" Apr 21 04:05:33.496650 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:05:33.496511 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/edfbb979-22f1-4a0d-ab4f-cff09e420a92-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fqzw7n\" (UID: \"edfbb979-22f1-4a0d-ab4f-cff09e420a92\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fqzw7n" Apr 21 04:05:33.597577 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:05:33.597535 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/edfbb979-22f1-4a0d-ab4f-cff09e420a92-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fqzw7n\" (UID: \"edfbb979-22f1-4a0d-ab4f-cff09e420a92\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fqzw7n" Apr 21 04:05:33.597577 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:05:33.597579 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xsxww\" (UniqueName: \"kubernetes.io/projected/edfbb979-22f1-4a0d-ab4f-cff09e420a92-kube-api-access-xsxww\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fqzw7n\" (UID: 
\"edfbb979-22f1-4a0d-ab4f-cff09e420a92\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fqzw7n" Apr 21 04:05:33.597829 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:05:33.597612 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/edfbb979-22f1-4a0d-ab4f-cff09e420a92-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fqzw7n\" (UID: \"edfbb979-22f1-4a0d-ab4f-cff09e420a92\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fqzw7n" Apr 21 04:05:33.597943 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:05:33.597921 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/edfbb979-22f1-4a0d-ab4f-cff09e420a92-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fqzw7n\" (UID: \"edfbb979-22f1-4a0d-ab4f-cff09e420a92\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fqzw7n" Apr 21 04:05:33.597993 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:05:33.597978 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/edfbb979-22f1-4a0d-ab4f-cff09e420a92-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fqzw7n\" (UID: \"edfbb979-22f1-4a0d-ab4f-cff09e420a92\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fqzw7n" Apr 21 04:05:33.605520 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:05:33.605491 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xsxww\" (UniqueName: \"kubernetes.io/projected/edfbb979-22f1-4a0d-ab4f-cff09e420a92-kube-api-access-xsxww\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fqzw7n\" (UID: \"edfbb979-22f1-4a0d-ab4f-cff09e420a92\") " 
pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fqzw7n" Apr 21 04:05:33.746233 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:05:33.746127 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fqzw7n" Apr 21 04:05:33.873965 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:05:33.873940 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fqzw7n"] Apr 21 04:05:33.876127 ip-10-0-134-136 kubenswrapper[2578]: W0421 04:05:33.876097 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podedfbb979_22f1_4a0d_ab4f_cff09e420a92.slice/crio-f73ccce12363eb8d94fd1a37b0e3419d46577b6a01742fb0b32b51e73c58d73a WatchSource:0}: Error finding container f73ccce12363eb8d94fd1a37b0e3419d46577b6a01742fb0b32b51e73c58d73a: Status 404 returned error can't find the container with id f73ccce12363eb8d94fd1a37b0e3419d46577b6a01742fb0b32b51e73c58d73a Apr 21 04:05:34.516017 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:05:34.515983 2578 generic.go:358] "Generic (PLEG): container finished" podID="edfbb979-22f1-4a0d-ab4f-cff09e420a92" containerID="c89bc34a25d78d45f3795bed8d162056364bc0f5003b4e3fdfe48b18a43bf586" exitCode=0 Apr 21 04:05:34.516524 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:05:34.516057 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fqzw7n" event={"ID":"edfbb979-22f1-4a0d-ab4f-cff09e420a92","Type":"ContainerDied","Data":"c89bc34a25d78d45f3795bed8d162056364bc0f5003b4e3fdfe48b18a43bf586"} Apr 21 04:05:34.516524 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:05:34.516085 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fqzw7n" event={"ID":"edfbb979-22f1-4a0d-ab4f-cff09e420a92","Type":"ContainerStarted","Data":"f73ccce12363eb8d94fd1a37b0e3419d46577b6a01742fb0b32b51e73c58d73a"} Apr 21 04:05:37.528348 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:05:37.528312 2578 generic.go:358] "Generic (PLEG): container finished" podID="edfbb979-22f1-4a0d-ab4f-cff09e420a92" containerID="0de5338859bf937e65b92a9d813451033be7f6a105629162a0ccee4acbaec9fb" exitCode=0 Apr 21 04:05:37.528718 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:05:37.528397 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fqzw7n" event={"ID":"edfbb979-22f1-4a0d-ab4f-cff09e420a92","Type":"ContainerDied","Data":"0de5338859bf937e65b92a9d813451033be7f6a105629162a0ccee4acbaec9fb"} Apr 21 04:05:38.533937 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:05:38.533906 2578 generic.go:358] "Generic (PLEG): container finished" podID="edfbb979-22f1-4a0d-ab4f-cff09e420a92" containerID="2bd94cc4cedbab3e85430b21d21a105501b839386d24a59a144be054eafaaa0d" exitCode=0 Apr 21 04:05:38.534353 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:05:38.533980 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fqzw7n" event={"ID":"edfbb979-22f1-4a0d-ab4f-cff09e420a92","Type":"ContainerDied","Data":"2bd94cc4cedbab3e85430b21d21a105501b839386d24a59a144be054eafaaa0d"} Apr 21 04:05:39.667647 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:05:39.667622 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fqzw7n" Apr 21 04:05:39.751384 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:05:39.751343 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/edfbb979-22f1-4a0d-ab4f-cff09e420a92-util\") pod \"edfbb979-22f1-4a0d-ab4f-cff09e420a92\" (UID: \"edfbb979-22f1-4a0d-ab4f-cff09e420a92\") " Apr 21 04:05:39.751586 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:05:39.751439 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/edfbb979-22f1-4a0d-ab4f-cff09e420a92-bundle\") pod \"edfbb979-22f1-4a0d-ab4f-cff09e420a92\" (UID: \"edfbb979-22f1-4a0d-ab4f-cff09e420a92\") " Apr 21 04:05:39.751586 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:05:39.751494 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xsxww\" (UniqueName: \"kubernetes.io/projected/edfbb979-22f1-4a0d-ab4f-cff09e420a92-kube-api-access-xsxww\") pod \"edfbb979-22f1-4a0d-ab4f-cff09e420a92\" (UID: \"edfbb979-22f1-4a0d-ab4f-cff09e420a92\") " Apr 21 04:05:39.751861 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:05:39.751826 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/edfbb979-22f1-4a0d-ab4f-cff09e420a92-bundle" (OuterVolumeSpecName: "bundle") pod "edfbb979-22f1-4a0d-ab4f-cff09e420a92" (UID: "edfbb979-22f1-4a0d-ab4f-cff09e420a92"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 04:05:39.753643 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:05:39.753616 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/edfbb979-22f1-4a0d-ab4f-cff09e420a92-kube-api-access-xsxww" (OuterVolumeSpecName: "kube-api-access-xsxww") pod "edfbb979-22f1-4a0d-ab4f-cff09e420a92" (UID: "edfbb979-22f1-4a0d-ab4f-cff09e420a92"). InnerVolumeSpecName "kube-api-access-xsxww". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 04:05:39.756657 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:05:39.756624 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/edfbb979-22f1-4a0d-ab4f-cff09e420a92-util" (OuterVolumeSpecName: "util") pod "edfbb979-22f1-4a0d-ab4f-cff09e420a92" (UID: "edfbb979-22f1-4a0d-ab4f-cff09e420a92"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 04:05:39.852148 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:05:39.852055 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xsxww\" (UniqueName: \"kubernetes.io/projected/edfbb979-22f1-4a0d-ab4f-cff09e420a92-kube-api-access-xsxww\") on node \"ip-10-0-134-136.ec2.internal\" DevicePath \"\"" Apr 21 04:05:39.852148 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:05:39.852087 2578 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/edfbb979-22f1-4a0d-ab4f-cff09e420a92-util\") on node \"ip-10-0-134-136.ec2.internal\" DevicePath \"\"" Apr 21 04:05:39.852148 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:05:39.852097 2578 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/edfbb979-22f1-4a0d-ab4f-cff09e420a92-bundle\") on node \"ip-10-0-134-136.ec2.internal\" DevicePath \"\"" Apr 21 04:05:40.542817 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:05:40.542786 2578 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fqzw7n" Apr 21 04:05:40.542817 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:05:40.542803 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fqzw7n" event={"ID":"edfbb979-22f1-4a0d-ab4f-cff09e420a92","Type":"ContainerDied","Data":"f73ccce12363eb8d94fd1a37b0e3419d46577b6a01742fb0b32b51e73c58d73a"} Apr 21 04:05:40.543023 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:05:40.542831 2578 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f73ccce12363eb8d94fd1a37b0e3419d46577b6a01742fb0b32b51e73c58d73a" Apr 21 04:06:01.233323 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:01.233273 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48358r6br"] Apr 21 04:06:01.233800 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:01.233644 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="edfbb979-22f1-4a0d-ab4f-cff09e420a92" containerName="extract" Apr 21 04:06:01.233800 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:01.233656 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="edfbb979-22f1-4a0d-ab4f-cff09e420a92" containerName="extract" Apr 21 04:06:01.233800 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:01.233676 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="edfbb979-22f1-4a0d-ab4f-cff09e420a92" containerName="util" Apr 21 04:06:01.233800 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:01.233682 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="edfbb979-22f1-4a0d-ab4f-cff09e420a92" containerName="util" Apr 21 04:06:01.233800 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:01.233689 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing 
container" podUID="edfbb979-22f1-4a0d-ab4f-cff09e420a92" containerName="pull" Apr 21 04:06:01.233800 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:01.233694 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="edfbb979-22f1-4a0d-ab4f-cff09e420a92" containerName="pull" Apr 21 04:06:01.233800 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:01.233754 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="edfbb979-22f1-4a0d-ab4f-cff09e420a92" containerName="extract" Apr 21 04:06:01.236857 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:01.236839 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48358r6br" Apr 21 04:06:01.239353 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:01.239329 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 21 04:06:01.239475 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:01.239380 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-k6n2p\"" Apr 21 04:06:01.240465 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:01.240440 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 21 04:06:01.243252 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:01.243233 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48358r6br"] Apr 21 04:06:01.333107 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:01.333068 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8e2e6581-87ad-4168-bd42-9180286b2bbc-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48358r6br\" (UID: 
\"8e2e6581-87ad-4168-bd42-9180286b2bbc\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48358r6br" Apr 21 04:06:01.333323 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:01.333136 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8e2e6581-87ad-4168-bd42-9180286b2bbc-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48358r6br\" (UID: \"8e2e6581-87ad-4168-bd42-9180286b2bbc\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48358r6br" Apr 21 04:06:01.333323 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:01.333180 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crwj2\" (UniqueName: \"kubernetes.io/projected/8e2e6581-87ad-4168-bd42-9180286b2bbc-kube-api-access-crwj2\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48358r6br\" (UID: \"8e2e6581-87ad-4168-bd42-9180286b2bbc\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48358r6br" Apr 21 04:06:01.433935 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:01.433898 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-crwj2\" (UniqueName: \"kubernetes.io/projected/8e2e6581-87ad-4168-bd42-9180286b2bbc-kube-api-access-crwj2\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48358r6br\" (UID: \"8e2e6581-87ad-4168-bd42-9180286b2bbc\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48358r6br" Apr 21 04:06:01.434109 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:01.433976 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8e2e6581-87ad-4168-bd42-9180286b2bbc-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48358r6br\" (UID: 
\"8e2e6581-87ad-4168-bd42-9180286b2bbc\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48358r6br" Apr 21 04:06:01.434109 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:01.434028 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8e2e6581-87ad-4168-bd42-9180286b2bbc-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48358r6br\" (UID: \"8e2e6581-87ad-4168-bd42-9180286b2bbc\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48358r6br" Apr 21 04:06:01.434407 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:01.434383 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8e2e6581-87ad-4168-bd42-9180286b2bbc-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48358r6br\" (UID: \"8e2e6581-87ad-4168-bd42-9180286b2bbc\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48358r6br" Apr 21 04:06:01.434458 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:01.434412 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8e2e6581-87ad-4168-bd42-9180286b2bbc-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48358r6br\" (UID: \"8e2e6581-87ad-4168-bd42-9180286b2bbc\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48358r6br" Apr 21 04:06:01.441910 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:01.441885 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-crwj2\" (UniqueName: \"kubernetes.io/projected/8e2e6581-87ad-4168-bd42-9180286b2bbc-kube-api-access-crwj2\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48358r6br\" (UID: \"8e2e6581-87ad-4168-bd42-9180286b2bbc\") " 
pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48358r6br" Apr 21 04:06:01.546472 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:01.546434 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48358r6br" Apr 21 04:06:01.670439 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:01.670413 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48358r6br"] Apr 21 04:06:01.672912 ip-10-0-134-136 kubenswrapper[2578]: W0421 04:06:01.672872 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e2e6581_87ad_4168_bd42_9180286b2bbc.slice/crio-990ae94ad167b1697966edeaeaae04961f12c20a535e3620d6ba4e4a384b9c08 WatchSource:0}: Error finding container 990ae94ad167b1697966edeaeaae04961f12c20a535e3620d6ba4e4a384b9c08: Status 404 returned error can't find the container with id 990ae94ad167b1697966edeaeaae04961f12c20a535e3620d6ba4e4a384b9c08 Apr 21 04:06:02.618507 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:02.618475 2578 generic.go:358] "Generic (PLEG): container finished" podID="8e2e6581-87ad-4168-bd42-9180286b2bbc" containerID="8bd3d6e8abadc8283d0be99b02411fbfebc019fd08e16114ac99c7877fda7f60" exitCode=0 Apr 21 04:06:02.618881 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:02.618574 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48358r6br" event={"ID":"8e2e6581-87ad-4168-bd42-9180286b2bbc","Type":"ContainerDied","Data":"8bd3d6e8abadc8283d0be99b02411fbfebc019fd08e16114ac99c7877fda7f60"} Apr 21 04:06:02.618881 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:02.618614 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48358r6br" event={"ID":"8e2e6581-87ad-4168-bd42-9180286b2bbc","Type":"ContainerStarted","Data":"990ae94ad167b1697966edeaeaae04961f12c20a535e3620d6ba4e4a384b9c08"} Apr 21 04:06:03.623172 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:03.623133 2578 generic.go:358] "Generic (PLEG): container finished" podID="8e2e6581-87ad-4168-bd42-9180286b2bbc" containerID="aa7be4c2a232005c86d87890fcbe70e65057306852d6eeed793c0e6306771ea7" exitCode=0 Apr 21 04:06:03.623591 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:03.623222 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48358r6br" event={"ID":"8e2e6581-87ad-4168-bd42-9180286b2bbc","Type":"ContainerDied","Data":"aa7be4c2a232005c86d87890fcbe70e65057306852d6eeed793c0e6306771ea7"} Apr 21 04:06:04.628718 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:04.628685 2578 generic.go:358] "Generic (PLEG): container finished" podID="8e2e6581-87ad-4168-bd42-9180286b2bbc" containerID="ba72bc0db56d78929a9f1a6394b1f7ae5f513959d85ea6f3f06e4d1ba25ed547" exitCode=0 Apr 21 04:06:04.629091 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:04.628777 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48358r6br" event={"ID":"8e2e6581-87ad-4168-bd42-9180286b2bbc","Type":"ContainerDied","Data":"ba72bc0db56d78929a9f1a6394b1f7ae5f513959d85ea6f3f06e4d1ba25ed547"} Apr 21 04:06:05.755941 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:05.755918 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48358r6br" Apr 21 04:06:05.769525 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:05.769492 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8e2e6581-87ad-4168-bd42-9180286b2bbc-util\") pod \"8e2e6581-87ad-4168-bd42-9180286b2bbc\" (UID: \"8e2e6581-87ad-4168-bd42-9180286b2bbc\") " Apr 21 04:06:05.769711 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:05.769569 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8e2e6581-87ad-4168-bd42-9180286b2bbc-bundle\") pod \"8e2e6581-87ad-4168-bd42-9180286b2bbc\" (UID: \"8e2e6581-87ad-4168-bd42-9180286b2bbc\") " Apr 21 04:06:05.769711 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:05.769601 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-crwj2\" (UniqueName: \"kubernetes.io/projected/8e2e6581-87ad-4168-bd42-9180286b2bbc-kube-api-access-crwj2\") pod \"8e2e6581-87ad-4168-bd42-9180286b2bbc\" (UID: \"8e2e6581-87ad-4168-bd42-9180286b2bbc\") " Apr 21 04:06:05.770442 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:05.770413 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8e2e6581-87ad-4168-bd42-9180286b2bbc-bundle" (OuterVolumeSpecName: "bundle") pod "8e2e6581-87ad-4168-bd42-9180286b2bbc" (UID: "8e2e6581-87ad-4168-bd42-9180286b2bbc"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 04:06:05.771859 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:05.771837 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e2e6581-87ad-4168-bd42-9180286b2bbc-kube-api-access-crwj2" (OuterVolumeSpecName: "kube-api-access-crwj2") pod "8e2e6581-87ad-4168-bd42-9180286b2bbc" (UID: "8e2e6581-87ad-4168-bd42-9180286b2bbc"). InnerVolumeSpecName "kube-api-access-crwj2". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 04:06:05.778743 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:05.778711 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8e2e6581-87ad-4168-bd42-9180286b2bbc-util" (OuterVolumeSpecName: "util") pod "8e2e6581-87ad-4168-bd42-9180286b2bbc" (UID: "8e2e6581-87ad-4168-bd42-9180286b2bbc"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 04:06:05.870762 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:05.870722 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-crwj2\" (UniqueName: \"kubernetes.io/projected/8e2e6581-87ad-4168-bd42-9180286b2bbc-kube-api-access-crwj2\") on node \"ip-10-0-134-136.ec2.internal\" DevicePath \"\"" Apr 21 04:06:05.870762 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:05.870754 2578 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8e2e6581-87ad-4168-bd42-9180286b2bbc-util\") on node \"ip-10-0-134-136.ec2.internal\" DevicePath \"\"" Apr 21 04:06:05.870762 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:05.870764 2578 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8e2e6581-87ad-4168-bd42-9180286b2bbc-bundle\") on node \"ip-10-0-134-136.ec2.internal\" DevicePath \"\"" Apr 21 04:06:06.637793 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:06.637765 2578 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48358r6br" Apr 21 04:06:06.637967 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:06.637766 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48358r6br" event={"ID":"8e2e6581-87ad-4168-bd42-9180286b2bbc","Type":"ContainerDied","Data":"990ae94ad167b1697966edeaeaae04961f12c20a535e3620d6ba4e4a384b9c08"} Apr 21 04:06:06.637967 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:06.637875 2578 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="990ae94ad167b1697966edeaeaae04961f12c20a535e3620d6ba4e4a384b9c08" Apr 21 04:06:15.743809 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:15.743777 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c247qx6"] Apr 21 04:06:15.744226 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:15.744146 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8e2e6581-87ad-4168-bd42-9180286b2bbc" containerName="pull" Apr 21 04:06:15.744226 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:15.744157 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e2e6581-87ad-4168-bd42-9180286b2bbc" containerName="pull" Apr 21 04:06:15.744226 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:15.744166 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8e2e6581-87ad-4168-bd42-9180286b2bbc" containerName="extract" Apr 21 04:06:15.744226 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:15.744172 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e2e6581-87ad-4168-bd42-9180286b2bbc" containerName="extract" Apr 21 04:06:15.744226 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:15.744183 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing 
container" podUID="8e2e6581-87ad-4168-bd42-9180286b2bbc" containerName="util" Apr 21 04:06:15.744226 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:15.744189 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e2e6581-87ad-4168-bd42-9180286b2bbc" containerName="util" Apr 21 04:06:15.744436 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:15.744244 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="8e2e6581-87ad-4168-bd42-9180286b2bbc" containerName="extract" Apr 21 04:06:15.748544 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:15.748522 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c247qx6" Apr 21 04:06:15.750827 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:15.750800 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 21 04:06:15.752111 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:15.752086 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 21 04:06:15.752296 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:15.752249 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-k6n2p\"" Apr 21 04:06:15.754995 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:15.754972 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c247qx6"] Apr 21 04:06:15.849563 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:15.849527 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gdtpt\" (UniqueName: \"kubernetes.io/projected/4f3e1424-b252-4210-a6a0-ea6a865efa29-kube-api-access-gdtpt\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c247qx6\" 
(UID: \"4f3e1424-b252-4210-a6a0-ea6a865efa29\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c247qx6" Apr 21 04:06:15.849725 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:15.849577 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4f3e1424-b252-4210-a6a0-ea6a865efa29-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c247qx6\" (UID: \"4f3e1424-b252-4210-a6a0-ea6a865efa29\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c247qx6" Apr 21 04:06:15.849725 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:15.849652 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4f3e1424-b252-4210-a6a0-ea6a865efa29-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c247qx6\" (UID: \"4f3e1424-b252-4210-a6a0-ea6a865efa29\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c247qx6" Apr 21 04:06:15.950162 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:15.950118 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gdtpt\" (UniqueName: \"kubernetes.io/projected/4f3e1424-b252-4210-a6a0-ea6a865efa29-kube-api-access-gdtpt\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c247qx6\" (UID: \"4f3e1424-b252-4210-a6a0-ea6a865efa29\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c247qx6" Apr 21 04:06:15.950379 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:15.950181 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4f3e1424-b252-4210-a6a0-ea6a865efa29-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c247qx6\" (UID: 
\"4f3e1424-b252-4210-a6a0-ea6a865efa29\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c247qx6" Apr 21 04:06:15.950379 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:15.950245 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4f3e1424-b252-4210-a6a0-ea6a865efa29-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c247qx6\" (UID: \"4f3e1424-b252-4210-a6a0-ea6a865efa29\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c247qx6" Apr 21 04:06:15.950623 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:15.950604 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4f3e1424-b252-4210-a6a0-ea6a865efa29-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c247qx6\" (UID: \"4f3e1424-b252-4210-a6a0-ea6a865efa29\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c247qx6" Apr 21 04:06:15.950659 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:15.950631 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4f3e1424-b252-4210-a6a0-ea6a865efa29-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c247qx6\" (UID: \"4f3e1424-b252-4210-a6a0-ea6a865efa29\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c247qx6" Apr 21 04:06:15.961471 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:15.961444 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gdtpt\" (UniqueName: \"kubernetes.io/projected/4f3e1424-b252-4210-a6a0-ea6a865efa29-kube-api-access-gdtpt\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c247qx6\" (UID: \"4f3e1424-b252-4210-a6a0-ea6a865efa29\") " 
pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c247qx6" Apr 21 04:06:16.058644 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:16.058606 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c247qx6" Apr 21 04:06:16.180086 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:16.180062 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c247qx6"] Apr 21 04:06:16.182211 ip-10-0-134-136 kubenswrapper[2578]: W0421 04:06:16.182179 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4f3e1424_b252_4210_a6a0_ea6a865efa29.slice/crio-475c23e35b19f28e5419a5b7de6c29c0a30497e3bf43903390123cd6fe77b588 WatchSource:0}: Error finding container 475c23e35b19f28e5419a5b7de6c29c0a30497e3bf43903390123cd6fe77b588: Status 404 returned error can't find the container with id 475c23e35b19f28e5419a5b7de6c29c0a30497e3bf43903390123cd6fe77b588 Apr 21 04:06:16.677335 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:16.677298 2578 generic.go:358] "Generic (PLEG): container finished" podID="4f3e1424-b252-4210-a6a0-ea6a865efa29" containerID="39ece53c76641aaa40814a2dae1eee86a778aad979db7a9d8356ef3f3b301ab5" exitCode=0 Apr 21 04:06:16.677527 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:16.677370 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c247qx6" event={"ID":"4f3e1424-b252-4210-a6a0-ea6a865efa29","Type":"ContainerDied","Data":"39ece53c76641aaa40814a2dae1eee86a778aad979db7a9d8356ef3f3b301ab5"} Apr 21 04:06:16.677527 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:16.677404 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c247qx6" event={"ID":"4f3e1424-b252-4210-a6a0-ea6a865efa29","Type":"ContainerStarted","Data":"475c23e35b19f28e5419a5b7de6c29c0a30497e3bf43903390123cd6fe77b588"} Apr 21 04:06:17.682323 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:17.682297 2578 generic.go:358] "Generic (PLEG): container finished" podID="4f3e1424-b252-4210-a6a0-ea6a865efa29" containerID="26c9b05325cc9ad221f7ae4b7ee5a4be3f1b2df9308cbb1f9f93aca92a144fa3" exitCode=0 Apr 21 04:06:17.682621 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:17.682332 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c247qx6" event={"ID":"4f3e1424-b252-4210-a6a0-ea6a865efa29","Type":"ContainerDied","Data":"26c9b05325cc9ad221f7ae4b7ee5a4be3f1b2df9308cbb1f9f93aca92a144fa3"} Apr 21 04:06:17.972862 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:17.972826 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-5s9br"] Apr 21 04:06:17.976138 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:17.976119 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-5s9br" Apr 21 04:06:17.981539 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:17.981514 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"kube-root-ca.crt\"" Apr 21 04:06:17.981666 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:17.981563 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operators\"/\"servicemesh-operator3-dockercfg-rtdwb\"" Apr 21 04:06:17.981666 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:17.981563 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"openshift-service-ca.crt\"" Apr 21 04:06:17.988813 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:17.988792 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-5s9br"] Apr 21 04:06:18.069626 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:18.069583 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8v5db\" (UniqueName: \"kubernetes.io/projected/9830998c-2fcd-43a5-a399-588ac195420a-kube-api-access-8v5db\") pod \"servicemesh-operator3-55f49c5f94-5s9br\" (UID: \"9830998c-2fcd-43a5-a399-588ac195420a\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-5s9br" Apr 21 04:06:18.069816 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:18.069703 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/9830998c-2fcd-43a5-a399-588ac195420a-operator-config\") pod \"servicemesh-operator3-55f49c5f94-5s9br\" (UID: \"9830998c-2fcd-43a5-a399-588ac195420a\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-5s9br" Apr 21 04:06:18.170804 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:18.170721 2578 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/9830998c-2fcd-43a5-a399-588ac195420a-operator-config\") pod \"servicemesh-operator3-55f49c5f94-5s9br\" (UID: \"9830998c-2fcd-43a5-a399-588ac195420a\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-5s9br" Apr 21 04:06:18.170804 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:18.170774 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8v5db\" (UniqueName: \"kubernetes.io/projected/9830998c-2fcd-43a5-a399-588ac195420a-kube-api-access-8v5db\") pod \"servicemesh-operator3-55f49c5f94-5s9br\" (UID: \"9830998c-2fcd-43a5-a399-588ac195420a\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-5s9br" Apr 21 04:06:18.173260 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:18.173234 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/9830998c-2fcd-43a5-a399-588ac195420a-operator-config\") pod \"servicemesh-operator3-55f49c5f94-5s9br\" (UID: \"9830998c-2fcd-43a5-a399-588ac195420a\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-5s9br" Apr 21 04:06:18.181858 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:18.181831 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8v5db\" (UniqueName: \"kubernetes.io/projected/9830998c-2fcd-43a5-a399-588ac195420a-kube-api-access-8v5db\") pod \"servicemesh-operator3-55f49c5f94-5s9br\" (UID: \"9830998c-2fcd-43a5-a399-588ac195420a\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-5s9br" Apr 21 04:06:18.285103 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:18.285068 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-5s9br" Apr 21 04:06:18.408407 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:18.408381 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-5s9br"] Apr 21 04:06:18.410226 ip-10-0-134-136 kubenswrapper[2578]: W0421 04:06:18.410199 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9830998c_2fcd_43a5_a399_588ac195420a.slice/crio-df3311d59240a1d435c2d6f54c59d89f15fb672949256ee7219d48fd5b00d0a4 WatchSource:0}: Error finding container df3311d59240a1d435c2d6f54c59d89f15fb672949256ee7219d48fd5b00d0a4: Status 404 returned error can't find the container with id df3311d59240a1d435c2d6f54c59d89f15fb672949256ee7219d48fd5b00d0a4 Apr 21 04:06:18.686960 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:18.686930 2578 generic.go:358] "Generic (PLEG): container finished" podID="4f3e1424-b252-4210-a6a0-ea6a865efa29" containerID="2fadf7036966fd316f7d7cae02446c169e7035c9925b76f8bbccbaa3f3f5c294" exitCode=0 Apr 21 04:06:18.687456 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:18.687001 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c247qx6" event={"ID":"4f3e1424-b252-4210-a6a0-ea6a865efa29","Type":"ContainerDied","Data":"2fadf7036966fd316f7d7cae02446c169e7035c9925b76f8bbccbaa3f3f5c294"} Apr 21 04:06:18.688237 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:18.688207 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-5s9br" event={"ID":"9830998c-2fcd-43a5-a399-588ac195420a","Type":"ContainerStarted","Data":"df3311d59240a1d435c2d6f54c59d89f15fb672949256ee7219d48fd5b00d0a4"} Apr 21 04:06:19.847302 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:19.847252 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c247qx6" Apr 21 04:06:19.888773 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:19.888737 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4f3e1424-b252-4210-a6a0-ea6a865efa29-util\") pod \"4f3e1424-b252-4210-a6a0-ea6a865efa29\" (UID: \"4f3e1424-b252-4210-a6a0-ea6a865efa29\") " Apr 21 04:06:19.888969 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:19.888807 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4f3e1424-b252-4210-a6a0-ea6a865efa29-bundle\") pod \"4f3e1424-b252-4210-a6a0-ea6a865efa29\" (UID: \"4f3e1424-b252-4210-a6a0-ea6a865efa29\") " Apr 21 04:06:19.888969 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:19.888836 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gdtpt\" (UniqueName: \"kubernetes.io/projected/4f3e1424-b252-4210-a6a0-ea6a865efa29-kube-api-access-gdtpt\") pod \"4f3e1424-b252-4210-a6a0-ea6a865efa29\" (UID: \"4f3e1424-b252-4210-a6a0-ea6a865efa29\") " Apr 21 04:06:19.889680 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:19.889653 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f3e1424-b252-4210-a6a0-ea6a865efa29-bundle" (OuterVolumeSpecName: "bundle") pod "4f3e1424-b252-4210-a6a0-ea6a865efa29" (UID: "4f3e1424-b252-4210-a6a0-ea6a865efa29"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 04:06:19.890904 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:19.890883 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f3e1424-b252-4210-a6a0-ea6a865efa29-kube-api-access-gdtpt" (OuterVolumeSpecName: "kube-api-access-gdtpt") pod "4f3e1424-b252-4210-a6a0-ea6a865efa29" (UID: "4f3e1424-b252-4210-a6a0-ea6a865efa29"). InnerVolumeSpecName "kube-api-access-gdtpt". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 04:06:19.894176 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:19.894140 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f3e1424-b252-4210-a6a0-ea6a865efa29-util" (OuterVolumeSpecName: "util") pod "4f3e1424-b252-4210-a6a0-ea6a865efa29" (UID: "4f3e1424-b252-4210-a6a0-ea6a865efa29"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 04:06:19.989575 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:19.989504 2578 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4f3e1424-b252-4210-a6a0-ea6a865efa29-util\") on node \"ip-10-0-134-136.ec2.internal\" DevicePath \"\"" Apr 21 04:06:19.989575 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:19.989538 2578 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4f3e1424-b252-4210-a6a0-ea6a865efa29-bundle\") on node \"ip-10-0-134-136.ec2.internal\" DevicePath \"\"" Apr 21 04:06:19.989575 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:19.989552 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gdtpt\" (UniqueName: \"kubernetes.io/projected/4f3e1424-b252-4210-a6a0-ea6a865efa29-kube-api-access-gdtpt\") on node \"ip-10-0-134-136.ec2.internal\" DevicePath \"\"" Apr 21 04:06:20.700293 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:20.700253 2578 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c247qx6" event={"ID":"4f3e1424-b252-4210-a6a0-ea6a865efa29","Type":"ContainerDied","Data":"475c23e35b19f28e5419a5b7de6c29c0a30497e3bf43903390123cd6fe77b588"} Apr 21 04:06:20.700489 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:20.700305 2578 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="475c23e35b19f28e5419a5b7de6c29c0a30497e3bf43903390123cd6fe77b588" Apr 21 04:06:20.700489 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:20.700316 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c247qx6" Apr 21 04:06:21.707063 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:21.707011 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-5s9br" event={"ID":"9830998c-2fcd-43a5-a399-588ac195420a","Type":"ContainerStarted","Data":"33ab88bf72c308144212299ee1eb5e8b6cc7e6d9f4744382537c4d4261efb4ef"} Apr 21 04:06:21.707742 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:21.707719 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-5s9br" Apr 21 04:06:21.729131 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:21.729078 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/servicemesh-operator3-55f49c5f94-5s9br" podStartSLOduration=2.288949344 podStartE2EDuration="4.729063393s" podCreationTimestamp="2026-04-21 04:06:17 +0000 UTC" firstStartedPulling="2026-04-21 04:06:18.412924968 +0000 UTC m=+522.082364022" lastFinishedPulling="2026-04-21 04:06:20.853039018 +0000 UTC m=+524.522478071" observedRunningTime="2026-04-21 04:06:21.726383303 +0000 UTC m=+525.395822401" watchObservedRunningTime="2026-04-21 04:06:21.729063393 +0000 UTC m=+525.398502469" 
Apr 21 04:06:26.078509 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:26.078474 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-gzshw"]
Apr 21 04:06:26.078883 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:26.078855 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4f3e1424-b252-4210-a6a0-ea6a865efa29" containerName="util"
Apr 21 04:06:26.078883 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:26.078866 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f3e1424-b252-4210-a6a0-ea6a865efa29" containerName="util"
Apr 21 04:06:26.078883 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:26.078880 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4f3e1424-b252-4210-a6a0-ea6a865efa29" containerName="extract"
Apr 21 04:06:26.078883 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:26.078885 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f3e1424-b252-4210-a6a0-ea6a865efa29" containerName="extract"
Apr 21 04:06:26.079021 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:26.078895 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4f3e1424-b252-4210-a6a0-ea6a865efa29" containerName="pull"
Apr 21 04:06:26.079021 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:26.078901 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f3e1424-b252-4210-a6a0-ea6a865efa29" containerName="pull"
Apr 21 04:06:26.079021 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:26.078958 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="4f3e1424-b252-4210-a6a0-ea6a865efa29" containerName="extract"
Apr 21 04:06:26.082173 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:26.082151 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-gzshw"
Apr 21 04:06:26.084734 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:26.084703 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-tls\""
Apr 21 04:06:26.084907 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:26.084765 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istio-kubeconfig\""
Apr 21 04:06:26.084907 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:26.084799 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-openshift-gateway-dockercfg-h8ts4\""
Apr 21 04:06:26.084907 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:26.084836 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\""
Apr 21 04:06:26.085073 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:26.085043 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"cacerts\""
Apr 21 04:06:26.096612 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:26.096587 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-gzshw"]
Apr 21 04:06:26.139336 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:26.139298 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/841ef081-9046-47bb-8f1e-7b662ff2b695-cacerts\") pod \"istiod-openshift-gateway-7cd77c7ffd-gzshw\" (UID: \"841ef081-9046-47bb-8f1e-7b662ff2b695\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-gzshw"
Apr 21 04:06:26.139492 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:26.139347 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/841ef081-9046-47bb-8f1e-7b662ff2b695-istio-kubeconfig\") pod \"istiod-openshift-gateway-7cd77c7ffd-gzshw\" (UID: \"841ef081-9046-47bb-8f1e-7b662ff2b695\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-gzshw"
Apr 21 04:06:26.139492 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:26.139365 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hlxjv\" (UniqueName: \"kubernetes.io/projected/841ef081-9046-47bb-8f1e-7b662ff2b695-kube-api-access-hlxjv\") pod \"istiod-openshift-gateway-7cd77c7ffd-gzshw\" (UID: \"841ef081-9046-47bb-8f1e-7b662ff2b695\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-gzshw"
Apr 21 04:06:26.139492 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:26.139409 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/841ef081-9046-47bb-8f1e-7b662ff2b695-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-7cd77c7ffd-gzshw\" (UID: \"841ef081-9046-47bb-8f1e-7b662ff2b695\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-gzshw"
Apr 21 04:06:26.139639 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:26.139514 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/841ef081-9046-47bb-8f1e-7b662ff2b695-local-certs\") pod \"istiod-openshift-gateway-7cd77c7ffd-gzshw\" (UID: \"841ef081-9046-47bb-8f1e-7b662ff2b695\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-gzshw"
Apr 21 04:06:26.139639 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:26.139549 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/841ef081-9046-47bb-8f1e-7b662ff2b695-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-7cd77c7ffd-gzshw\" (UID: \"841ef081-9046-47bb-8f1e-7b662ff2b695\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-gzshw"
Apr 21 04:06:26.139639 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:26.139571 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/841ef081-9046-47bb-8f1e-7b662ff2b695-istio-token\") pod \"istiod-openshift-gateway-7cd77c7ffd-gzshw\" (UID: \"841ef081-9046-47bb-8f1e-7b662ff2b695\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-gzshw"
Apr 21 04:06:26.240559 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:26.240523 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/841ef081-9046-47bb-8f1e-7b662ff2b695-cacerts\") pod \"istiod-openshift-gateway-7cd77c7ffd-gzshw\" (UID: \"841ef081-9046-47bb-8f1e-7b662ff2b695\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-gzshw"
Apr 21 04:06:26.240559 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:26.240561 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/841ef081-9046-47bb-8f1e-7b662ff2b695-istio-kubeconfig\") pod \"istiod-openshift-gateway-7cd77c7ffd-gzshw\" (UID: \"841ef081-9046-47bb-8f1e-7b662ff2b695\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-gzshw"
Apr 21 04:06:26.240819 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:26.240579 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hlxjv\" (UniqueName: \"kubernetes.io/projected/841ef081-9046-47bb-8f1e-7b662ff2b695-kube-api-access-hlxjv\") pod \"istiod-openshift-gateway-7cd77c7ffd-gzshw\" (UID: \"841ef081-9046-47bb-8f1e-7b662ff2b695\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-gzshw"
Apr 21 04:06:26.240819 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:26.240615 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/841ef081-9046-47bb-8f1e-7b662ff2b695-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-7cd77c7ffd-gzshw\" (UID: \"841ef081-9046-47bb-8f1e-7b662ff2b695\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-gzshw"
Apr 21 04:06:26.240819 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:26.240650 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/841ef081-9046-47bb-8f1e-7b662ff2b695-local-certs\") pod \"istiod-openshift-gateway-7cd77c7ffd-gzshw\" (UID: \"841ef081-9046-47bb-8f1e-7b662ff2b695\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-gzshw"
Apr 21 04:06:26.240819 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:26.240680 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/841ef081-9046-47bb-8f1e-7b662ff2b695-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-7cd77c7ffd-gzshw\" (UID: \"841ef081-9046-47bb-8f1e-7b662ff2b695\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-gzshw"
Apr 21 04:06:26.240819 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:26.240706 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/841ef081-9046-47bb-8f1e-7b662ff2b695-istio-token\") pod \"istiod-openshift-gateway-7cd77c7ffd-gzshw\" (UID: \"841ef081-9046-47bb-8f1e-7b662ff2b695\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-gzshw"
Apr 21 04:06:26.241423 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:26.241400 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/841ef081-9046-47bb-8f1e-7b662ff2b695-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-7cd77c7ffd-gzshw\" (UID: \"841ef081-9046-47bb-8f1e-7b662ff2b695\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-gzshw"
Apr 21 04:06:26.243148 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:26.243126 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/841ef081-9046-47bb-8f1e-7b662ff2b695-istio-kubeconfig\") pod \"istiod-openshift-gateway-7cd77c7ffd-gzshw\" (UID: \"841ef081-9046-47bb-8f1e-7b662ff2b695\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-gzshw"
Apr 21 04:06:26.243434 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:26.243409 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/841ef081-9046-47bb-8f1e-7b662ff2b695-local-certs\") pod \"istiod-openshift-gateway-7cd77c7ffd-gzshw\" (UID: \"841ef081-9046-47bb-8f1e-7b662ff2b695\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-gzshw"
Apr 21 04:06:26.243547 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:26.243533 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/841ef081-9046-47bb-8f1e-7b662ff2b695-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-7cd77c7ffd-gzshw\" (UID: \"841ef081-9046-47bb-8f1e-7b662ff2b695\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-gzshw"
Apr 21 04:06:26.243664 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:26.243641 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/841ef081-9046-47bb-8f1e-7b662ff2b695-cacerts\") pod \"istiod-openshift-gateway-7cd77c7ffd-gzshw\" (UID: \"841ef081-9046-47bb-8f1e-7b662ff2b695\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-gzshw"
Apr 21 04:06:26.248368 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:26.248345 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/841ef081-9046-47bb-8f1e-7b662ff2b695-istio-token\") pod \"istiod-openshift-gateway-7cd77c7ffd-gzshw\" (UID: \"841ef081-9046-47bb-8f1e-7b662ff2b695\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-gzshw"
Apr 21 04:06:26.248772 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:26.248752 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hlxjv\" (UniqueName: \"kubernetes.io/projected/841ef081-9046-47bb-8f1e-7b662ff2b695-kube-api-access-hlxjv\") pod \"istiod-openshift-gateway-7cd77c7ffd-gzshw\" (UID: \"841ef081-9046-47bb-8f1e-7b662ff2b695\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-gzshw"
Apr 21 04:06:26.391637 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:26.391546 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-gzshw"
Apr 21 04:06:26.533739 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:26.533716 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-gzshw"]
Apr 21 04:06:26.535691 ip-10-0-134-136 kubenswrapper[2578]: W0421 04:06:26.535654 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod841ef081_9046_47bb_8f1e_7b662ff2b695.slice/crio-9ec919273ddf5126be4c4c34fa3b7a1de7ff20d203a8b9e713aca9ba48e9e87f WatchSource:0}: Error finding container 9ec919273ddf5126be4c4c34fa3b7a1de7ff20d203a8b9e713aca9ba48e9e87f: Status 404 returned error can't find the container with id 9ec919273ddf5126be4c4c34fa3b7a1de7ff20d203a8b9e713aca9ba48e9e87f
Apr 21 04:06:26.725861 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:26.725776 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-gzshw" event={"ID":"841ef081-9046-47bb-8f1e-7b662ff2b695","Type":"ContainerStarted","Data":"9ec919273ddf5126be4c4c34fa3b7a1de7ff20d203a8b9e713aca9ba48e9e87f"}
Apr 21 04:06:29.221320 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:29.221268 2578 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"}
Apr 21 04:06:29.221571 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:29.221351 2578 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"}
Apr 21 04:06:29.738703 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:29.738670 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-gzshw" event={"ID":"841ef081-9046-47bb-8f1e-7b662ff2b695","Type":"ContainerStarted","Data":"9c59527d915a0d958f693adcc3dc1cafe8837a7010b9128a5c59be1c21ea7edc"}
Apr 21 04:06:29.738868 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:29.738808 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-gzshw"
Apr 21 04:06:29.759556 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:29.759499 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-gzshw" podStartSLOduration=1.076983877 podStartE2EDuration="3.759481968s" podCreationTimestamp="2026-04-21 04:06:26 +0000 UTC" firstStartedPulling="2026-04-21 04:06:26.538539836 +0000 UTC m=+530.207978888" lastFinishedPulling="2026-04-21 04:06:29.221037912 +0000 UTC m=+532.890476979" observedRunningTime="2026-04-21 04:06:29.757165329 +0000 UTC m=+533.426604405" watchObservedRunningTime="2026-04-21 04:06:29.759481968 +0000 UTC m=+533.428921045"
Apr 21 04:06:30.743815 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:30.743789 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-gzshw"
Apr 21 04:06:32.597091 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:32.597050 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-j9s6j"]
Apr 21 04:06:32.601062 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:32.601045 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-j9s6j"
Apr 21 04:06:32.603823 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:32.603801 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"openshift-ai-inference-openshift-default-dockercfg-wsf79\""
Apr 21 04:06:32.609316 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:32.609272 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-j9s6j"]
Apr 21 04:06:32.694437 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:32.694398 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/0cf4f637-103b-4401-9d5e-1ab49201634c-workload-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-j9s6j\" (UID: \"0cf4f637-103b-4401-9d5e-1ab49201634c\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-j9s6j"
Apr 21 04:06:32.694437 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:32.694441 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/0cf4f637-103b-4401-9d5e-1ab49201634c-istio-podinfo\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-j9s6j\" (UID: \"0cf4f637-103b-4401-9d5e-1ab49201634c\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-j9s6j"
Apr 21 04:06:32.694647 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:32.694462 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/0cf4f637-103b-4401-9d5e-1ab49201634c-istio-envoy\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-j9s6j\" (UID: \"0cf4f637-103b-4401-9d5e-1ab49201634c\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-j9s6j"
Apr 21 04:06:32.694647 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:32.694586 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/0cf4f637-103b-4401-9d5e-1ab49201634c-credential-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-j9s6j\" (UID: \"0cf4f637-103b-4401-9d5e-1ab49201634c\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-j9s6j"
Apr 21 04:06:32.694647 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:32.694634 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/0cf4f637-103b-4401-9d5e-1ab49201634c-workload-certs\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-j9s6j\" (UID: \"0cf4f637-103b-4401-9d5e-1ab49201634c\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-j9s6j"
Apr 21 04:06:32.694761 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:32.694654 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/0cf4f637-103b-4401-9d5e-1ab49201634c-istiod-ca-cert\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-j9s6j\" (UID: \"0cf4f637-103b-4401-9d5e-1ab49201634c\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-j9s6j"
Apr 21 04:06:32.694761 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:32.694673 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79d7q\" (UniqueName: \"kubernetes.io/projected/0cf4f637-103b-4401-9d5e-1ab49201634c-kube-api-access-79d7q\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-j9s6j\" (UID: \"0cf4f637-103b-4401-9d5e-1ab49201634c\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-j9s6j"
Apr 21 04:06:32.694761 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:32.694706 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/0cf4f637-103b-4401-9d5e-1ab49201634c-istio-token\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-j9s6j\" (UID: \"0cf4f637-103b-4401-9d5e-1ab49201634c\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-j9s6j"
Apr 21 04:06:32.694872 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:32.694774 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/0cf4f637-103b-4401-9d5e-1ab49201634c-istio-data\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-j9s6j\" (UID: \"0cf4f637-103b-4401-9d5e-1ab49201634c\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-j9s6j"
Apr 21 04:06:32.795234 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:32.795192 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/0cf4f637-103b-4401-9d5e-1ab49201634c-credential-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-j9s6j\" (UID: \"0cf4f637-103b-4401-9d5e-1ab49201634c\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-j9s6j"
Apr 21 04:06:32.795234 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:32.795230 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/0cf4f637-103b-4401-9d5e-1ab49201634c-workload-certs\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-j9s6j\" (UID: \"0cf4f637-103b-4401-9d5e-1ab49201634c\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-j9s6j"
Apr 21 04:06:32.795488 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:32.795249 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/0cf4f637-103b-4401-9d5e-1ab49201634c-istiod-ca-cert\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-j9s6j\" (UID: \"0cf4f637-103b-4401-9d5e-1ab49201634c\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-j9s6j"
Apr 21 04:06:32.795488 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:32.795265 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-79d7q\" (UniqueName: \"kubernetes.io/projected/0cf4f637-103b-4401-9d5e-1ab49201634c-kube-api-access-79d7q\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-j9s6j\" (UID: \"0cf4f637-103b-4401-9d5e-1ab49201634c\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-j9s6j"
Apr 21 04:06:32.795488 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:32.795327 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/0cf4f637-103b-4401-9d5e-1ab49201634c-istio-token\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-j9s6j\" (UID: \"0cf4f637-103b-4401-9d5e-1ab49201634c\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-j9s6j"
Apr 21 04:06:32.795488 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:32.795381 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/0cf4f637-103b-4401-9d5e-1ab49201634c-istio-data\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-j9s6j\" (UID: \"0cf4f637-103b-4401-9d5e-1ab49201634c\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-j9s6j"
Apr 21 04:06:32.795488 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:32.795441 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/0cf4f637-103b-4401-9d5e-1ab49201634c-workload-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-j9s6j\" (UID: \"0cf4f637-103b-4401-9d5e-1ab49201634c\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-j9s6j"
Apr 21 04:06:32.795488 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:32.795468 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/0cf4f637-103b-4401-9d5e-1ab49201634c-istio-podinfo\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-j9s6j\" (UID: \"0cf4f637-103b-4401-9d5e-1ab49201634c\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-j9s6j"
Apr 21 04:06:32.795806 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:32.795567 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/0cf4f637-103b-4401-9d5e-1ab49201634c-istio-envoy\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-j9s6j\" (UID: \"0cf4f637-103b-4401-9d5e-1ab49201634c\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-j9s6j"
Apr 21 04:06:32.795806 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:32.795665 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/0cf4f637-103b-4401-9d5e-1ab49201634c-workload-certs\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-j9s6j\" (UID: \"0cf4f637-103b-4401-9d5e-1ab49201634c\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-j9s6j"
Apr 21 04:06:32.795806 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:32.795727 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/0cf4f637-103b-4401-9d5e-1ab49201634c-credential-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-j9s6j\" (UID: \"0cf4f637-103b-4401-9d5e-1ab49201634c\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-j9s6j"
Apr 21 04:06:32.795967 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:32.795831 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/0cf4f637-103b-4401-9d5e-1ab49201634c-workload-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-j9s6j\" (UID: \"0cf4f637-103b-4401-9d5e-1ab49201634c\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-j9s6j"
Apr 21 04:06:32.796106 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:32.796085 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/0cf4f637-103b-4401-9d5e-1ab49201634c-istio-data\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-j9s6j\" (UID: \"0cf4f637-103b-4401-9d5e-1ab49201634c\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-j9s6j"
Apr 21 04:06:32.796172 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:32.796098 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/0cf4f637-103b-4401-9d5e-1ab49201634c-istiod-ca-cert\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-j9s6j\" (UID: \"0cf4f637-103b-4401-9d5e-1ab49201634c\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-j9s6j"
Apr 21 04:06:32.797672 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:32.797645 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/0cf4f637-103b-4401-9d5e-1ab49201634c-istio-envoy\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-j9s6j\" (UID: \"0cf4f637-103b-4401-9d5e-1ab49201634c\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-j9s6j"
Apr 21 04:06:32.797954 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:32.797937 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/0cf4f637-103b-4401-9d5e-1ab49201634c-istio-podinfo\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-j9s6j\" (UID: \"0cf4f637-103b-4401-9d5e-1ab49201634c\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-j9s6j"
Apr 21 04:06:32.803754 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:32.803731 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/0cf4f637-103b-4401-9d5e-1ab49201634c-istio-token\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-j9s6j\" (UID: \"0cf4f637-103b-4401-9d5e-1ab49201634c\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-j9s6j"
Apr 21 04:06:32.803840 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:32.803818 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-79d7q\" (UniqueName: \"kubernetes.io/projected/0cf4f637-103b-4401-9d5e-1ab49201634c-kube-api-access-79d7q\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-j9s6j\" (UID: \"0cf4f637-103b-4401-9d5e-1ab49201634c\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-j9s6j"
Apr 21 04:06:32.915133 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:32.915046 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-j9s6j"
Apr 21 04:06:33.043801 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:33.043775 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-j9s6j"]
Apr 21 04:06:33.045888 ip-10-0-134-136 kubenswrapper[2578]: W0421 04:06:33.045858 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0cf4f637_103b_4401_9d5e_1ab49201634c.slice/crio-f94e17e4e26cd8b17aee9029c0cce9f4babbababa5472c20793c47c42c1d7d1b WatchSource:0}: Error finding container f94e17e4e26cd8b17aee9029c0cce9f4babbababa5472c20793c47c42c1d7d1b: Status 404 returned error can't find the container with id f94e17e4e26cd8b17aee9029c0cce9f4babbababa5472c20793c47c42c1d7d1b
Apr 21 04:06:33.716588 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:33.716557 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-5s9br"
Apr 21 04:06:33.761035 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:33.760997 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-j9s6j" event={"ID":"0cf4f637-103b-4401-9d5e-1ab49201634c","Type":"ContainerStarted","Data":"f94e17e4e26cd8b17aee9029c0cce9f4babbababa5472c20793c47c42c1d7d1b"}
Apr 21 04:06:35.692365 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:35.692326 2578 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"}
Apr 21 04:06:35.692640 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:35.692400 2578 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"}
Apr 21 04:06:35.692640 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:35.692426 2578 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"}
Apr 21 04:06:35.771357 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:35.771321 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-j9s6j" event={"ID":"0cf4f637-103b-4401-9d5e-1ab49201634c","Type":"ContainerStarted","Data":"a1848d65b22fd5e090158bbaae640d47567c25a45dcbd91f8dbf0dc85f02cbe2"}
Apr 21 04:06:35.793060 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:35.793000 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-j9s6j" podStartSLOduration=1.148973107 podStartE2EDuration="3.79298524s" podCreationTimestamp="2026-04-21 04:06:32 +0000 UTC" firstStartedPulling="2026-04-21 04:06:33.048046701 +0000 UTC m=+536.717485755" lastFinishedPulling="2026-04-21 04:06:35.692058826 +0000 UTC m=+539.361497888" observedRunningTime="2026-04-21 04:06:35.79022495 +0000 UTC m=+539.459664017" watchObservedRunningTime="2026-04-21 04:06:35.79298524 +0000 UTC m=+539.462424315"
Apr 21 04:06:35.915224 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:35.915193 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-j9s6j"
Apr 21 04:06:36.919921 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:36.919892 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-j9s6j"
Apr 21 04:06:37.779117 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:37.779085 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-j9s6j"
Apr 21 04:06:37.780021 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:37.780004 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-j9s6j"
Apr 21 04:06:38.420554 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:38.420515 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-875d59bd6-mqc5b"]
Apr 21 04:06:38.428150 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:38.428122 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-875d59bd6-mqc5b"
Apr 21 04:06:38.435018 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:38.434992 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-875d59bd6-mqc5b"]
Apr 21 04:06:38.560075 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:38.560045 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f3bccf2e-9c8f-49d5-a797-6235b2ebd1b3-oauth-serving-cert\") pod \"console-875d59bd6-mqc5b\" (UID: \"f3bccf2e-9c8f-49d5-a797-6235b2ebd1b3\") " pod="openshift-console/console-875d59bd6-mqc5b"
Apr 21 04:06:38.560245 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:38.560096 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f3bccf2e-9c8f-49d5-a797-6235b2ebd1b3-trusted-ca-bundle\") pod \"console-875d59bd6-mqc5b\" (UID: \"f3bccf2e-9c8f-49d5-a797-6235b2ebd1b3\") " pod="openshift-console/console-875d59bd6-mqc5b"
Apr 21 04:06:38.560245 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:38.560128 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f3bccf2e-9c8f-49d5-a797-6235b2ebd1b3-console-serving-cert\") pod \"console-875d59bd6-mqc5b\" (UID: \"f3bccf2e-9c8f-49d5-a797-6235b2ebd1b3\") " pod="openshift-console/console-875d59bd6-mqc5b"
Apr 21 04:06:38.560245 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:38.560176 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f3bccf2e-9c8f-49d5-a797-6235b2ebd1b3-console-oauth-config\") pod \"console-875d59bd6-mqc5b\" (UID: \"f3bccf2e-9c8f-49d5-a797-6235b2ebd1b3\") " pod="openshift-console/console-875d59bd6-mqc5b"
Apr 21 04:06:38.560245 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:38.560192 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f3bccf2e-9c8f-49d5-a797-6235b2ebd1b3-service-ca\") pod \"console-875d59bd6-mqc5b\" (UID: \"f3bccf2e-9c8f-49d5-a797-6235b2ebd1b3\") " pod="openshift-console/console-875d59bd6-mqc5b"
Apr 21 04:06:38.560245 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:38.560220 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qkh6r\" (UniqueName: \"kubernetes.io/projected/f3bccf2e-9c8f-49d5-a797-6235b2ebd1b3-kube-api-access-qkh6r\") pod \"console-875d59bd6-mqc5b\" (UID: \"f3bccf2e-9c8f-49d5-a797-6235b2ebd1b3\") " pod="openshift-console/console-875d59bd6-mqc5b"
Apr 21 04:06:38.560460 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:38.560260 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f3bccf2e-9c8f-49d5-a797-6235b2ebd1b3-console-config\") pod \"console-875d59bd6-mqc5b\" (UID: \"f3bccf2e-9c8f-49d5-a797-6235b2ebd1b3\") " pod="openshift-console/console-875d59bd6-mqc5b"
Apr 21 04:06:38.660950 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:38.660898 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f3bccf2e-9c8f-49d5-a797-6235b2ebd1b3-oauth-serving-cert\") pod \"console-875d59bd6-mqc5b\" (UID: \"f3bccf2e-9c8f-49d5-a797-6235b2ebd1b3\") " pod="openshift-console/console-875d59bd6-mqc5b"
Apr 21 04:06:38.661138 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:38.660972 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f3bccf2e-9c8f-49d5-a797-6235b2ebd1b3-trusted-ca-bundle\") pod \"console-875d59bd6-mqc5b\" (UID: \"f3bccf2e-9c8f-49d5-a797-6235b2ebd1b3\") " pod="openshift-console/console-875d59bd6-mqc5b"
Apr 21 04:06:38.661138 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:38.660993 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f3bccf2e-9c8f-49d5-a797-6235b2ebd1b3-console-serving-cert\") pod \"console-875d59bd6-mqc5b\" (UID: \"f3bccf2e-9c8f-49d5-a797-6235b2ebd1b3\") " pod="openshift-console/console-875d59bd6-mqc5b"
Apr 21 04:06:38.661138 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:38.661020 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f3bccf2e-9c8f-49d5-a797-6235b2ebd1b3-console-oauth-config\") pod \"console-875d59bd6-mqc5b\" (UID: \"f3bccf2e-9c8f-49d5-a797-6235b2ebd1b3\") " pod="openshift-console/console-875d59bd6-mqc5b"
Apr 21 04:06:38.661138 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:38.661036 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f3bccf2e-9c8f-49d5-a797-6235b2ebd1b3-service-ca\") pod \"console-875d59bd6-mqc5b\" (UID: \"f3bccf2e-9c8f-49d5-a797-6235b2ebd1b3\") " pod="openshift-console/console-875d59bd6-mqc5b"
Apr 21 04:06:38.661138 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:38.661063 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qkh6r\" (UniqueName: \"kubernetes.io/projected/f3bccf2e-9c8f-49d5-a797-6235b2ebd1b3-kube-api-access-qkh6r\") pod \"console-875d59bd6-mqc5b\" (UID: \"f3bccf2e-9c8f-49d5-a797-6235b2ebd1b3\") " pod="openshift-console/console-875d59bd6-mqc5b"
Apr 21 04:06:38.661427 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:38.661146 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f3bccf2e-9c8f-49d5-a797-6235b2ebd1b3-console-config\") pod \"console-875d59bd6-mqc5b\" (UID: \"f3bccf2e-9c8f-49d5-a797-6235b2ebd1b3\") " pod="openshift-console/console-875d59bd6-mqc5b"
Apr 21 04:06:38.661839 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:38.661812 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f3bccf2e-9c8f-49d5-a797-6235b2ebd1b3-oauth-serving-cert\") pod \"console-875d59bd6-mqc5b\" (UID: \"f3bccf2e-9c8f-49d5-a797-6235b2ebd1b3\") " pod="openshift-console/console-875d59bd6-mqc5b"
Apr 21 04:06:38.661976 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:38.661844 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f3bccf2e-9c8f-49d5-a797-6235b2ebd1b3-console-config\") pod \"console-875d59bd6-mqc5b\" (UID: \"f3bccf2e-9c8f-49d5-a797-6235b2ebd1b3\") " 
pod="openshift-console/console-875d59bd6-mqc5b" Apr 21 04:06:38.661976 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:38.661887 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f3bccf2e-9c8f-49d5-a797-6235b2ebd1b3-trusted-ca-bundle\") pod \"console-875d59bd6-mqc5b\" (UID: \"f3bccf2e-9c8f-49d5-a797-6235b2ebd1b3\") " pod="openshift-console/console-875d59bd6-mqc5b" Apr 21 04:06:38.662063 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:38.662028 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f3bccf2e-9c8f-49d5-a797-6235b2ebd1b3-service-ca\") pod \"console-875d59bd6-mqc5b\" (UID: \"f3bccf2e-9c8f-49d5-a797-6235b2ebd1b3\") " pod="openshift-console/console-875d59bd6-mqc5b" Apr 21 04:06:38.663642 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:38.663605 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f3bccf2e-9c8f-49d5-a797-6235b2ebd1b3-console-oauth-config\") pod \"console-875d59bd6-mqc5b\" (UID: \"f3bccf2e-9c8f-49d5-a797-6235b2ebd1b3\") " pod="openshift-console/console-875d59bd6-mqc5b" Apr 21 04:06:38.663759 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:38.663688 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f3bccf2e-9c8f-49d5-a797-6235b2ebd1b3-console-serving-cert\") pod \"console-875d59bd6-mqc5b\" (UID: \"f3bccf2e-9c8f-49d5-a797-6235b2ebd1b3\") " pod="openshift-console/console-875d59bd6-mqc5b" Apr 21 04:06:38.668544 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:38.668522 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qkh6r\" (UniqueName: \"kubernetes.io/projected/f3bccf2e-9c8f-49d5-a797-6235b2ebd1b3-kube-api-access-qkh6r\") pod \"console-875d59bd6-mqc5b\" (UID: 
\"f3bccf2e-9c8f-49d5-a797-6235b2ebd1b3\") " pod="openshift-console/console-875d59bd6-mqc5b" Apr 21 04:06:38.739449 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:38.739354 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-875d59bd6-mqc5b" Apr 21 04:06:38.861585 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:38.861561 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-875d59bd6-mqc5b"] Apr 21 04:06:38.863130 ip-10-0-134-136 kubenswrapper[2578]: W0421 04:06:38.863100 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf3bccf2e_9c8f_49d5_a797_6235b2ebd1b3.slice/crio-d92e7f0011b6333e678717632537eac48329b23d7a5dff61c2afd493c2c8716b WatchSource:0}: Error finding container d92e7f0011b6333e678717632537eac48329b23d7a5dff61c2afd493c2c8716b: Status 404 returned error can't find the container with id d92e7f0011b6333e678717632537eac48329b23d7a5dff61c2afd493c2c8716b Apr 21 04:06:39.787852 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:39.787809 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-875d59bd6-mqc5b" event={"ID":"f3bccf2e-9c8f-49d5-a797-6235b2ebd1b3","Type":"ContainerStarted","Data":"2b5b651c0428b23faf10e8723cb393425febcc92aa7af36acc515d467ed9bce6"} Apr 21 04:06:39.787852 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:39.787856 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-875d59bd6-mqc5b" event={"ID":"f3bccf2e-9c8f-49d5-a797-6235b2ebd1b3","Type":"ContainerStarted","Data":"d92e7f0011b6333e678717632537eac48329b23d7a5dff61c2afd493c2c8716b"} Apr 21 04:06:39.810137 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:39.810082 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-875d59bd6-mqc5b" podStartSLOduration=1.810064192 podStartE2EDuration="1.810064192s" 
podCreationTimestamp="2026-04-21 04:06:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 04:06:39.808400313 +0000 UTC m=+543.477839388" watchObservedRunningTime="2026-04-21 04:06:39.810064192 +0000 UTC m=+543.479503267" Apr 21 04:06:40.752925 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:40.752886 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bg5bwh"] Apr 21 04:06:40.756900 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:40.756882 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bg5bwh" Apr 21 04:06:40.759389 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:40.759363 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 21 04:06:40.760460 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:40.760438 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 21 04:06:40.760598 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:40.760481 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-k6n2p\"" Apr 21 04:06:40.763193 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:40.763164 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bg5bwh"] Apr 21 04:06:40.857617 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:40.857583 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503pp9jf"] Apr 21 04:06:40.861254 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:40.861234 2578 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503pp9jf" Apr 21 04:06:40.868644 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:40.868610 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503pp9jf"] Apr 21 04:06:40.882560 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:40.882512 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bhkwb\" (UniqueName: \"kubernetes.io/projected/d29cdec6-bc44-4468-be4a-6696bb7bc4a9-kube-api-access-bhkwb\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bg5bwh\" (UID: \"d29cdec6-bc44-4468-be4a-6696bb7bc4a9\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bg5bwh" Apr 21 04:06:40.882760 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:40.882672 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d29cdec6-bc44-4468-be4a-6696bb7bc4a9-bundle\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bg5bwh\" (UID: \"d29cdec6-bc44-4468-be4a-6696bb7bc4a9\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bg5bwh" Apr 21 04:06:40.882760 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:40.882728 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d29cdec6-bc44-4468-be4a-6696bb7bc4a9-util\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bg5bwh\" (UID: \"d29cdec6-bc44-4468-be4a-6696bb7bc4a9\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bg5bwh" Apr 21 04:06:40.953360 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:40.953324 2578 kubelet.go:2537] "SyncLoop 
ADD" source="api" pods=["openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30zqzmn"] Apr 21 04:06:40.957028 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:40.957011 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30zqzmn" Apr 21 04:06:40.963754 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:40.963727 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30zqzmn"] Apr 21 04:06:40.984435 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:40.984401 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqdxv\" (UniqueName: \"kubernetes.io/projected/06aa4f62-8552-4318-9c55-d065c4f6ca0d-kube-api-access-jqdxv\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503pp9jf\" (UID: \"06aa4f62-8552-4318-9c55-d065c4f6ca0d\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503pp9jf" Apr 21 04:06:40.984435 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:40.984440 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/06aa4f62-8552-4318-9c55-d065c4f6ca0d-bundle\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503pp9jf\" (UID: \"06aa4f62-8552-4318-9c55-d065c4f6ca0d\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503pp9jf" Apr 21 04:06:40.984679 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:40.984467 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bhkwb\" (UniqueName: \"kubernetes.io/projected/d29cdec6-bc44-4468-be4a-6696bb7bc4a9-kube-api-access-bhkwb\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bg5bwh\" (UID: 
\"d29cdec6-bc44-4468-be4a-6696bb7bc4a9\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bg5bwh" Apr 21 04:06:40.984679 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:40.984544 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d29cdec6-bc44-4468-be4a-6696bb7bc4a9-bundle\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bg5bwh\" (UID: \"d29cdec6-bc44-4468-be4a-6696bb7bc4a9\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bg5bwh" Apr 21 04:06:40.984679 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:40.984589 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d29cdec6-bc44-4468-be4a-6696bb7bc4a9-util\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bg5bwh\" (UID: \"d29cdec6-bc44-4468-be4a-6696bb7bc4a9\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bg5bwh" Apr 21 04:06:40.984679 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:40.984621 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/06aa4f62-8552-4318-9c55-d065c4f6ca0d-util\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503pp9jf\" (UID: \"06aa4f62-8552-4318-9c55-d065c4f6ca0d\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503pp9jf" Apr 21 04:06:40.984925 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:40.984909 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d29cdec6-bc44-4468-be4a-6696bb7bc4a9-bundle\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bg5bwh\" (UID: \"d29cdec6-bc44-4468-be4a-6696bb7bc4a9\") " 
pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bg5bwh" Apr 21 04:06:40.984962 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:40.984938 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d29cdec6-bc44-4468-be4a-6696bb7bc4a9-util\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bg5bwh\" (UID: \"d29cdec6-bc44-4468-be4a-6696bb7bc4a9\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bg5bwh" Apr 21 04:06:40.992487 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:40.992463 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bhkwb\" (UniqueName: \"kubernetes.io/projected/d29cdec6-bc44-4468-be4a-6696bb7bc4a9-kube-api-access-bhkwb\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bg5bwh\" (UID: \"d29cdec6-bc44-4468-be4a-6696bb7bc4a9\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bg5bwh" Apr 21 04:06:41.058594 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:41.058553 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88bxzp8"] Apr 21 04:06:41.062361 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:41.062340 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88bxzp8" Apr 21 04:06:41.067490 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:41.067465 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bg5bwh" Apr 21 04:06:41.069566 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:41.069543 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88bxzp8"] Apr 21 04:06:41.085642 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:41.085590 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jqdxv\" (UniqueName: \"kubernetes.io/projected/06aa4f62-8552-4318-9c55-d065c4f6ca0d-kube-api-access-jqdxv\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503pp9jf\" (UID: \"06aa4f62-8552-4318-9c55-d065c4f6ca0d\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503pp9jf" Apr 21 04:06:41.085789 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:41.085643 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/06aa4f62-8552-4318-9c55-d065c4f6ca0d-bundle\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503pp9jf\" (UID: \"06aa4f62-8552-4318-9c55-d065c4f6ca0d\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503pp9jf" Apr 21 04:06:41.085789 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:41.085720 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/06aa4f62-8552-4318-9c55-d065c4f6ca0d-util\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503pp9jf\" (UID: \"06aa4f62-8552-4318-9c55-d065c4f6ca0d\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503pp9jf" Apr 21 04:06:41.085789 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:41.085776 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-2ksbn\" (UniqueName: \"kubernetes.io/projected/3d72c2a9-f78a-41a6-b215-769fc412ff03-kube-api-access-2ksbn\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30zqzmn\" (UID: \"3d72c2a9-f78a-41a6-b215-769fc412ff03\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30zqzmn" Apr 21 04:06:41.085977 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:41.085805 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3d72c2a9-f78a-41a6-b215-769fc412ff03-util\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30zqzmn\" (UID: \"3d72c2a9-f78a-41a6-b215-769fc412ff03\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30zqzmn" Apr 21 04:06:41.085977 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:41.085838 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3d72c2a9-f78a-41a6-b215-769fc412ff03-bundle\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30zqzmn\" (UID: \"3d72c2a9-f78a-41a6-b215-769fc412ff03\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30zqzmn" Apr 21 04:06:41.086274 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:41.086251 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/06aa4f62-8552-4318-9c55-d065c4f6ca0d-util\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503pp9jf\" (UID: \"06aa4f62-8552-4318-9c55-d065c4f6ca0d\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503pp9jf" Apr 21 04:06:41.086401 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:41.086274 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/06aa4f62-8552-4318-9c55-d065c4f6ca0d-bundle\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503pp9jf\" (UID: \"06aa4f62-8552-4318-9c55-d065c4f6ca0d\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503pp9jf" Apr 21 04:06:41.093351 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:41.093325 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqdxv\" (UniqueName: \"kubernetes.io/projected/06aa4f62-8552-4318-9c55-d065c4f6ca0d-kube-api-access-jqdxv\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503pp9jf\" (UID: \"06aa4f62-8552-4318-9c55-d065c4f6ca0d\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503pp9jf" Apr 21 04:06:41.171749 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:41.171716 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503pp9jf" Apr 21 04:06:41.187016 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:41.186990 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/99eb7f21-72ec-4d9b-9ab8-b800ec9fd483-util\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88bxzp8\" (UID: \"99eb7f21-72ec-4d9b-9ab8-b800ec9fd483\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88bxzp8" Apr 21 04:06:41.187170 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:41.187028 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n2wzp\" (UniqueName: \"kubernetes.io/projected/99eb7f21-72ec-4d9b-9ab8-b800ec9fd483-kube-api-access-n2wzp\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88bxzp8\" (UID: \"99eb7f21-72ec-4d9b-9ab8-b800ec9fd483\") " 
pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88bxzp8" Apr 21 04:06:41.187170 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:41.187139 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2ksbn\" (UniqueName: \"kubernetes.io/projected/3d72c2a9-f78a-41a6-b215-769fc412ff03-kube-api-access-2ksbn\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30zqzmn\" (UID: \"3d72c2a9-f78a-41a6-b215-769fc412ff03\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30zqzmn" Apr 21 04:06:41.187311 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:41.187179 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3d72c2a9-f78a-41a6-b215-769fc412ff03-util\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30zqzmn\" (UID: \"3d72c2a9-f78a-41a6-b215-769fc412ff03\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30zqzmn" Apr 21 04:06:41.187311 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:41.187216 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3d72c2a9-f78a-41a6-b215-769fc412ff03-bundle\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30zqzmn\" (UID: \"3d72c2a9-f78a-41a6-b215-769fc412ff03\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30zqzmn" Apr 21 04:06:41.187311 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:41.187247 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/99eb7f21-72ec-4d9b-9ab8-b800ec9fd483-bundle\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88bxzp8\" (UID: \"99eb7f21-72ec-4d9b-9ab8-b800ec9fd483\") " 
pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88bxzp8" Apr 21 04:06:41.187619 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:41.187583 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3d72c2a9-f78a-41a6-b215-769fc412ff03-bundle\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30zqzmn\" (UID: \"3d72c2a9-f78a-41a6-b215-769fc412ff03\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30zqzmn" Apr 21 04:06:41.187711 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:41.187626 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3d72c2a9-f78a-41a6-b215-769fc412ff03-util\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30zqzmn\" (UID: \"3d72c2a9-f78a-41a6-b215-769fc412ff03\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30zqzmn" Apr 21 04:06:41.194707 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:41.194682 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bg5bwh"] Apr 21 04:06:41.195645 ip-10-0-134-136 kubenswrapper[2578]: W0421 04:06:41.195624 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd29cdec6_bc44_4468_be4a_6696bb7bc4a9.slice/crio-cf928a0e5f4a44a5debcf4c169b40d80621fa0d9db7fc69da4d58e87c7658162 WatchSource:0}: Error finding container cf928a0e5f4a44a5debcf4c169b40d80621fa0d9db7fc69da4d58e87c7658162: Status 404 returned error can't find the container with id cf928a0e5f4a44a5debcf4c169b40d80621fa0d9db7fc69da4d58e87c7658162 Apr 21 04:06:41.195868 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:41.195843 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ksbn\" 
(UniqueName: \"kubernetes.io/projected/3d72c2a9-f78a-41a6-b215-769fc412ff03-kube-api-access-2ksbn\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30zqzmn\" (UID: \"3d72c2a9-f78a-41a6-b215-769fc412ff03\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30zqzmn" Apr 21 04:06:41.267959 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:41.267931 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30zqzmn" Apr 21 04:06:41.289161 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:41.288525 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/99eb7f21-72ec-4d9b-9ab8-b800ec9fd483-bundle\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88bxzp8\" (UID: \"99eb7f21-72ec-4d9b-9ab8-b800ec9fd483\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88bxzp8" Apr 21 04:06:41.289161 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:41.288680 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/99eb7f21-72ec-4d9b-9ab8-b800ec9fd483-util\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88bxzp8\" (UID: \"99eb7f21-72ec-4d9b-9ab8-b800ec9fd483\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88bxzp8" Apr 21 04:06:41.289161 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:41.288726 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n2wzp\" (UniqueName: \"kubernetes.io/projected/99eb7f21-72ec-4d9b-9ab8-b800ec9fd483-kube-api-access-n2wzp\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88bxzp8\" (UID: \"99eb7f21-72ec-4d9b-9ab8-b800ec9fd483\") " 
pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88bxzp8"
Apr 21 04:06:41.291602 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:41.291000 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/99eb7f21-72ec-4d9b-9ab8-b800ec9fd483-bundle\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88bxzp8\" (UID: \"99eb7f21-72ec-4d9b-9ab8-b800ec9fd483\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88bxzp8"
Apr 21 04:06:41.291602 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:41.291258 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/99eb7f21-72ec-4d9b-9ab8-b800ec9fd483-util\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88bxzp8\" (UID: \"99eb7f21-72ec-4d9b-9ab8-b800ec9fd483\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88bxzp8"
Apr 21 04:06:41.296860 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:41.296831 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n2wzp\" (UniqueName: \"kubernetes.io/projected/99eb7f21-72ec-4d9b-9ab8-b800ec9fd483-kube-api-access-n2wzp\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88bxzp8\" (UID: \"99eb7f21-72ec-4d9b-9ab8-b800ec9fd483\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88bxzp8"
Apr 21 04:06:41.302230 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:41.302201 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503pp9jf"]
Apr 21 04:06:41.304044 ip-10-0-134-136 kubenswrapper[2578]: W0421 04:06:41.304020 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod06aa4f62_8552_4318_9c55_d065c4f6ca0d.slice/crio-51d8431f368cdd9bad764503478d97ad4d0b8d5b64801439c7411e407290fe74 WatchSource:0}: Error finding container 51d8431f368cdd9bad764503478d97ad4d0b8d5b64801439c7411e407290fe74: Status 404 returned error can't find the container with id 51d8431f368cdd9bad764503478d97ad4d0b8d5b64801439c7411e407290fe74
Apr 21 04:06:41.375091 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:41.375057 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88bxzp8"
Apr 21 04:06:41.419156 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:41.419119 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30zqzmn"]
Apr 21 04:06:41.511921 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:41.511892 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88bxzp8"]
Apr 21 04:06:41.530754 ip-10-0-134-136 kubenswrapper[2578]: W0421 04:06:41.530715 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod99eb7f21_72ec_4d9b_9ab8_b800ec9fd483.slice/crio-f13a12ca0a383c8f25b1dba2a6021b0a331cdbac8b503f8a4572106cabbf38a9 WatchSource:0}: Error finding container f13a12ca0a383c8f25b1dba2a6021b0a331cdbac8b503f8a4572106cabbf38a9: Status 404 returned error can't find the container with id f13a12ca0a383c8f25b1dba2a6021b0a331cdbac8b503f8a4572106cabbf38a9
Apr 21 04:06:41.796437 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:41.796403 2578 generic.go:358] "Generic (PLEG): container finished" podID="99eb7f21-72ec-4d9b-9ab8-b800ec9fd483" containerID="059418dda493738dc693d38aa4f66fb1d2e2f847ddbb2642bd6eeb7a1d46433c" exitCode=0
Apr 21 04:06:41.796599 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:41.796478 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88bxzp8" event={"ID":"99eb7f21-72ec-4d9b-9ab8-b800ec9fd483","Type":"ContainerDied","Data":"059418dda493738dc693d38aa4f66fb1d2e2f847ddbb2642bd6eeb7a1d46433c"}
Apr 21 04:06:41.796599 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:41.796509 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88bxzp8" event={"ID":"99eb7f21-72ec-4d9b-9ab8-b800ec9fd483","Type":"ContainerStarted","Data":"f13a12ca0a383c8f25b1dba2a6021b0a331cdbac8b503f8a4572106cabbf38a9"}
Apr 21 04:06:41.797711 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:41.797689 2578 generic.go:358] "Generic (PLEG): container finished" podID="06aa4f62-8552-4318-9c55-d065c4f6ca0d" containerID="a4d047937949c6f4c5ebd039c82d05243e190b80e04a4843cdf7fe55f3b3ee95" exitCode=0
Apr 21 04:06:41.797807 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:41.797761 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503pp9jf" event={"ID":"06aa4f62-8552-4318-9c55-d065c4f6ca0d","Type":"ContainerDied","Data":"a4d047937949c6f4c5ebd039c82d05243e190b80e04a4843cdf7fe55f3b3ee95"}
Apr 21 04:06:41.797909 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:41.797803 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503pp9jf" event={"ID":"06aa4f62-8552-4318-9c55-d065c4f6ca0d","Type":"ContainerStarted","Data":"51d8431f368cdd9bad764503478d97ad4d0b8d5b64801439c7411e407290fe74"}
Apr 21 04:06:41.799162 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:41.799145 2578 generic.go:358] "Generic (PLEG): container finished" podID="d29cdec6-bc44-4468-be4a-6696bb7bc4a9" containerID="543f5cdb92b6e7e5d161fdf1c4ecfc6487b7318651025387296ea95f1ef46242" exitCode=0
Apr 21 04:06:41.799241 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:41.799224 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bg5bwh" event={"ID":"d29cdec6-bc44-4468-be4a-6696bb7bc4a9","Type":"ContainerDied","Data":"543f5cdb92b6e7e5d161fdf1c4ecfc6487b7318651025387296ea95f1ef46242"}
Apr 21 04:06:41.799315 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:41.799250 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bg5bwh" event={"ID":"d29cdec6-bc44-4468-be4a-6696bb7bc4a9","Type":"ContainerStarted","Data":"cf928a0e5f4a44a5debcf4c169b40d80621fa0d9db7fc69da4d58e87c7658162"}
Apr 21 04:06:41.800786 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:41.800765 2578 generic.go:358] "Generic (PLEG): container finished" podID="3d72c2a9-f78a-41a6-b215-769fc412ff03" containerID="ad796f084edb6f8181e9db4893b237d022ca65c2d418450384351f05c37c8673" exitCode=0
Apr 21 04:06:41.800858 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:41.800800 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30zqzmn" event={"ID":"3d72c2a9-f78a-41a6-b215-769fc412ff03","Type":"ContainerDied","Data":"ad796f084edb6f8181e9db4893b237d022ca65c2d418450384351f05c37c8673"}
Apr 21 04:06:41.800858 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:41.800824 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30zqzmn" event={"ID":"3d72c2a9-f78a-41a6-b215-769fc412ff03","Type":"ContainerStarted","Data":"c2084b3afb5878dd0a3dd45c7747ec9c736420423409bc107d0469242b45fb94"}
Apr 21 04:06:43.809724 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:43.809631 2578 generic.go:358] "Generic (PLEG): container finished" podID="06aa4f62-8552-4318-9c55-d065c4f6ca0d" containerID="72b7cc88664db25dace2e20302ad8f7f072755834866919984286d90b711c9bc" exitCode=0
Apr 21 04:06:43.809724 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:43.809697 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503pp9jf" event={"ID":"06aa4f62-8552-4318-9c55-d065c4f6ca0d","Type":"ContainerDied","Data":"72b7cc88664db25dace2e20302ad8f7f072755834866919984286d90b711c9bc"}
Apr 21 04:06:43.811571 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:43.811546 2578 generic.go:358] "Generic (PLEG): container finished" podID="d29cdec6-bc44-4468-be4a-6696bb7bc4a9" containerID="a35167f431efb01a9ef1a2d79f91770d07cdcb92eaa06c385e255add05f39308" exitCode=0
Apr 21 04:06:43.811651 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:43.811629 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bg5bwh" event={"ID":"d29cdec6-bc44-4468-be4a-6696bb7bc4a9","Type":"ContainerDied","Data":"a35167f431efb01a9ef1a2d79f91770d07cdcb92eaa06c385e255add05f39308"}
Apr 21 04:06:43.813430 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:43.813410 2578 generic.go:358] "Generic (PLEG): container finished" podID="3d72c2a9-f78a-41a6-b215-769fc412ff03" containerID="2f3de861aff7bd541be47642842ff883556a2d2b24d103ea015c17c5919f1d2c" exitCode=0
Apr 21 04:06:43.813521 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:43.813491 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30zqzmn" event={"ID":"3d72c2a9-f78a-41a6-b215-769fc412ff03","Type":"ContainerDied","Data":"2f3de861aff7bd541be47642842ff883556a2d2b24d103ea015c17c5919f1d2c"}
Apr 21 04:06:43.815206 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:43.815129 2578 generic.go:358] "Generic (PLEG): container finished" podID="99eb7f21-72ec-4d9b-9ab8-b800ec9fd483" containerID="7502a9dddf7199ff5f477c0fd13049bc7e1ac363880978247c4ebd019880e12c" exitCode=0
Apr 21 04:06:43.815206 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:43.815156 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88bxzp8" event={"ID":"99eb7f21-72ec-4d9b-9ab8-b800ec9fd483","Type":"ContainerDied","Data":"7502a9dddf7199ff5f477c0fd13049bc7e1ac363880978247c4ebd019880e12c"}
Apr 21 04:06:44.826891 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:44.826854 2578 generic.go:358] "Generic (PLEG): container finished" podID="99eb7f21-72ec-4d9b-9ab8-b800ec9fd483" containerID="3505c02334d567fb27bea01deb5a72232914c4c43377b604132b282548cd0a16" exitCode=0
Apr 21 04:06:44.827336 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:44.826932 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88bxzp8" event={"ID":"99eb7f21-72ec-4d9b-9ab8-b800ec9fd483","Type":"ContainerDied","Data":"3505c02334d567fb27bea01deb5a72232914c4c43377b604132b282548cd0a16"}
Apr 21 04:06:44.828685 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:44.828662 2578 generic.go:358] "Generic (PLEG): container finished" podID="06aa4f62-8552-4318-9c55-d065c4f6ca0d" containerID="954c3c3b860ca8b34dec96904f73f056073009fb8ae71b1406ed55de32e76db3" exitCode=0
Apr 21 04:06:44.828795 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:44.828722 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503pp9jf" event={"ID":"06aa4f62-8552-4318-9c55-d065c4f6ca0d","Type":"ContainerDied","Data":"954c3c3b860ca8b34dec96904f73f056073009fb8ae71b1406ed55de32e76db3"}
Apr 21 04:06:44.830404 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:44.830375 2578 generic.go:358] "Generic (PLEG): container finished" podID="d29cdec6-bc44-4468-be4a-6696bb7bc4a9" containerID="635b88d4ff0dd3b3dab64ed3aa5dd24205913f32992d8f5d0c7b6b7a54055e36" exitCode=0
Apr 21 04:06:44.830511 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:44.830462 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bg5bwh" event={"ID":"d29cdec6-bc44-4468-be4a-6696bb7bc4a9","Type":"ContainerDied","Data":"635b88d4ff0dd3b3dab64ed3aa5dd24205913f32992d8f5d0c7b6b7a54055e36"}
Apr 21 04:06:44.832222 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:44.832201 2578 generic.go:358] "Generic (PLEG): container finished" podID="3d72c2a9-f78a-41a6-b215-769fc412ff03" containerID="1a0718a839a789b2cfaf62c9bdf36f59c7168afc24ccd463584548ad71ccdd24" exitCode=0
Apr 21 04:06:44.832310 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:44.832244 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30zqzmn" event={"ID":"3d72c2a9-f78a-41a6-b215-769fc412ff03","Type":"ContainerDied","Data":"1a0718a839a789b2cfaf62c9bdf36f59c7168afc24ccd463584548ad71ccdd24"}
Apr 21 04:06:45.996267 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:45.996234 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88bxzp8"
Apr 21 04:06:46.049909 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:46.049886 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503pp9jf"
Apr 21 04:06:46.057014 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:46.056995 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30zqzmn"
Apr 21 04:06:46.059312 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:46.059266 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bg5bwh"
Apr 21 04:06:46.138190 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:46.138094 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/99eb7f21-72ec-4d9b-9ab8-b800ec9fd483-util\") pod \"99eb7f21-72ec-4d9b-9ab8-b800ec9fd483\" (UID: \"99eb7f21-72ec-4d9b-9ab8-b800ec9fd483\") "
Apr 21 04:06:46.138395 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:46.138267 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n2wzp\" (UniqueName: \"kubernetes.io/projected/99eb7f21-72ec-4d9b-9ab8-b800ec9fd483-kube-api-access-n2wzp\") pod \"99eb7f21-72ec-4d9b-9ab8-b800ec9fd483\" (UID: \"99eb7f21-72ec-4d9b-9ab8-b800ec9fd483\") "
Apr 21 04:06:46.138395 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:46.138344 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/99eb7f21-72ec-4d9b-9ab8-b800ec9fd483-bundle\") pod \"99eb7f21-72ec-4d9b-9ab8-b800ec9fd483\" (UID: \"99eb7f21-72ec-4d9b-9ab8-b800ec9fd483\") "
Apr 21 04:06:46.138802 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:46.138779 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/99eb7f21-72ec-4d9b-9ab8-b800ec9fd483-bundle" (OuterVolumeSpecName: "bundle") pod "99eb7f21-72ec-4d9b-9ab8-b800ec9fd483" (UID: "99eb7f21-72ec-4d9b-9ab8-b800ec9fd483"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 21 04:06:46.140257 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:46.140237 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99eb7f21-72ec-4d9b-9ab8-b800ec9fd483-kube-api-access-n2wzp" (OuterVolumeSpecName: "kube-api-access-n2wzp") pod "99eb7f21-72ec-4d9b-9ab8-b800ec9fd483" (UID: "99eb7f21-72ec-4d9b-9ab8-b800ec9fd483"). InnerVolumeSpecName "kube-api-access-n2wzp". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 04:06:46.143555 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:46.143533 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/99eb7f21-72ec-4d9b-9ab8-b800ec9fd483-util" (OuterVolumeSpecName: "util") pod "99eb7f21-72ec-4d9b-9ab8-b800ec9fd483" (UID: "99eb7f21-72ec-4d9b-9ab8-b800ec9fd483"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 21 04:06:46.238878 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:46.238831 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d29cdec6-bc44-4468-be4a-6696bb7bc4a9-util\") pod \"d29cdec6-bc44-4468-be4a-6696bb7bc4a9\" (UID: \"d29cdec6-bc44-4468-be4a-6696bb7bc4a9\") "
Apr 21 04:06:46.238878 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:46.238887 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3d72c2a9-f78a-41a6-b215-769fc412ff03-util\") pod \"3d72c2a9-f78a-41a6-b215-769fc412ff03\" (UID: \"3d72c2a9-f78a-41a6-b215-769fc412ff03\") "
Apr 21 04:06:46.239062 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:46.238936 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jqdxv\" (UniqueName: \"kubernetes.io/projected/06aa4f62-8552-4318-9c55-d065c4f6ca0d-kube-api-access-jqdxv\") pod \"06aa4f62-8552-4318-9c55-d065c4f6ca0d\" (UID: \"06aa4f62-8552-4318-9c55-d065c4f6ca0d\") "
Apr 21 04:06:46.239062 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:46.238978 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/06aa4f62-8552-4318-9c55-d065c4f6ca0d-util\") pod \"06aa4f62-8552-4318-9c55-d065c4f6ca0d\" (UID: \"06aa4f62-8552-4318-9c55-d065c4f6ca0d\") "
Apr 21 04:06:46.239062 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:46.239014 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bhkwb\" (UniqueName: \"kubernetes.io/projected/d29cdec6-bc44-4468-be4a-6696bb7bc4a9-kube-api-access-bhkwb\") pod \"d29cdec6-bc44-4468-be4a-6696bb7bc4a9\" (UID: \"d29cdec6-bc44-4468-be4a-6696bb7bc4a9\") "
Apr 21 04:06:46.239062 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:46.239040 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d29cdec6-bc44-4468-be4a-6696bb7bc4a9-bundle\") pod \"d29cdec6-bc44-4468-be4a-6696bb7bc4a9\" (UID: \"d29cdec6-bc44-4468-be4a-6696bb7bc4a9\") "
Apr 21 04:06:46.239256 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:46.239072 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2ksbn\" (UniqueName: \"kubernetes.io/projected/3d72c2a9-f78a-41a6-b215-769fc412ff03-kube-api-access-2ksbn\") pod \"3d72c2a9-f78a-41a6-b215-769fc412ff03\" (UID: \"3d72c2a9-f78a-41a6-b215-769fc412ff03\") "
Apr 21 04:06:46.239256 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:46.239114 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3d72c2a9-f78a-41a6-b215-769fc412ff03-bundle\") pod \"3d72c2a9-f78a-41a6-b215-769fc412ff03\" (UID: \"3d72c2a9-f78a-41a6-b215-769fc412ff03\") "
Apr 21 04:06:46.239256 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:46.239193 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/06aa4f62-8552-4318-9c55-d065c4f6ca0d-bundle\") pod \"06aa4f62-8552-4318-9c55-d065c4f6ca0d\" (UID: \"06aa4f62-8552-4318-9c55-d065c4f6ca0d\") "
Apr 21 04:06:46.240841 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:46.239536 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-n2wzp\" (UniqueName: \"kubernetes.io/projected/99eb7f21-72ec-4d9b-9ab8-b800ec9fd483-kube-api-access-n2wzp\") on node \"ip-10-0-134-136.ec2.internal\" DevicePath \"\""
Apr 21 04:06:46.240841 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:46.239559 2578 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/99eb7f21-72ec-4d9b-9ab8-b800ec9fd483-bundle\") on node \"ip-10-0-134-136.ec2.internal\" DevicePath \"\""
Apr 21 04:06:46.240841 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:46.239573 2578 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/99eb7f21-72ec-4d9b-9ab8-b800ec9fd483-util\") on node \"ip-10-0-134-136.ec2.internal\" DevicePath \"\""
Apr 21 04:06:46.240841 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:46.240098 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d29cdec6-bc44-4468-be4a-6696bb7bc4a9-bundle" (OuterVolumeSpecName: "bundle") pod "d29cdec6-bc44-4468-be4a-6696bb7bc4a9" (UID: "d29cdec6-bc44-4468-be4a-6696bb7bc4a9"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 21 04:06:46.240841 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:46.240191 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/06aa4f62-8552-4318-9c55-d065c4f6ca0d-bundle" (OuterVolumeSpecName: "bundle") pod "06aa4f62-8552-4318-9c55-d065c4f6ca0d" (UID: "06aa4f62-8552-4318-9c55-d065c4f6ca0d"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 21 04:06:46.240841 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:46.240804 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d72c2a9-f78a-41a6-b215-769fc412ff03-bundle" (OuterVolumeSpecName: "bundle") pod "3d72c2a9-f78a-41a6-b215-769fc412ff03" (UID: "3d72c2a9-f78a-41a6-b215-769fc412ff03"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 21 04:06:46.241593 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:46.241565 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06aa4f62-8552-4318-9c55-d065c4f6ca0d-kube-api-access-jqdxv" (OuterVolumeSpecName: "kube-api-access-jqdxv") pod "06aa4f62-8552-4318-9c55-d065c4f6ca0d" (UID: "06aa4f62-8552-4318-9c55-d065c4f6ca0d"). InnerVolumeSpecName "kube-api-access-jqdxv". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 04:06:46.241711 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:46.241592 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d29cdec6-bc44-4468-be4a-6696bb7bc4a9-kube-api-access-bhkwb" (OuterVolumeSpecName: "kube-api-access-bhkwb") pod "d29cdec6-bc44-4468-be4a-6696bb7bc4a9" (UID: "d29cdec6-bc44-4468-be4a-6696bb7bc4a9"). InnerVolumeSpecName "kube-api-access-bhkwb". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 04:06:46.241711 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:46.241612 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d72c2a9-f78a-41a6-b215-769fc412ff03-kube-api-access-2ksbn" (OuterVolumeSpecName: "kube-api-access-2ksbn") pod "3d72c2a9-f78a-41a6-b215-769fc412ff03" (UID: "3d72c2a9-f78a-41a6-b215-769fc412ff03"). InnerVolumeSpecName "kube-api-access-2ksbn". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 04:06:46.245459 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:46.245419 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d72c2a9-f78a-41a6-b215-769fc412ff03-util" (OuterVolumeSpecName: "util") pod "3d72c2a9-f78a-41a6-b215-769fc412ff03" (UID: "3d72c2a9-f78a-41a6-b215-769fc412ff03"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 21 04:06:46.245545 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:46.245490 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/06aa4f62-8552-4318-9c55-d065c4f6ca0d-util" (OuterVolumeSpecName: "util") pod "06aa4f62-8552-4318-9c55-d065c4f6ca0d" (UID: "06aa4f62-8552-4318-9c55-d065c4f6ca0d"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 21 04:06:46.245633 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:46.245616 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d29cdec6-bc44-4468-be4a-6696bb7bc4a9-util" (OuterVolumeSpecName: "util") pod "d29cdec6-bc44-4468-be4a-6696bb7bc4a9" (UID: "d29cdec6-bc44-4468-be4a-6696bb7bc4a9"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 21 04:06:46.340569 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:46.340516 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jqdxv\" (UniqueName: \"kubernetes.io/projected/06aa4f62-8552-4318-9c55-d065c4f6ca0d-kube-api-access-jqdxv\") on node \"ip-10-0-134-136.ec2.internal\" DevicePath \"\""
Apr 21 04:06:46.340569 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:46.340565 2578 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/06aa4f62-8552-4318-9c55-d065c4f6ca0d-util\") on node \"ip-10-0-134-136.ec2.internal\" DevicePath \"\""
Apr 21 04:06:46.340569 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:46.340575 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bhkwb\" (UniqueName: \"kubernetes.io/projected/d29cdec6-bc44-4468-be4a-6696bb7bc4a9-kube-api-access-bhkwb\") on node \"ip-10-0-134-136.ec2.internal\" DevicePath \"\""
Apr 21 04:06:46.340569 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:46.340585 2578 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d29cdec6-bc44-4468-be4a-6696bb7bc4a9-bundle\") on node \"ip-10-0-134-136.ec2.internal\" DevicePath \"\""
Apr 21 04:06:46.340843 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:46.340594 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2ksbn\" (UniqueName: \"kubernetes.io/projected/3d72c2a9-f78a-41a6-b215-769fc412ff03-kube-api-access-2ksbn\") on node \"ip-10-0-134-136.ec2.internal\" DevicePath \"\""
Apr 21 04:06:46.340843 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:46.340602 2578 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3d72c2a9-f78a-41a6-b215-769fc412ff03-bundle\") on node \"ip-10-0-134-136.ec2.internal\" DevicePath \"\""
Apr 21 04:06:46.340843 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:46.340612 2578 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/06aa4f62-8552-4318-9c55-d065c4f6ca0d-bundle\") on node \"ip-10-0-134-136.ec2.internal\" DevicePath \"\""
Apr 21 04:06:46.340843 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:46.340620 2578 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d29cdec6-bc44-4468-be4a-6696bb7bc4a9-util\") on node \"ip-10-0-134-136.ec2.internal\" DevicePath \"\""
Apr 21 04:06:46.340843 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:46.340629 2578 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3d72c2a9-f78a-41a6-b215-769fc412ff03-util\") on node \"ip-10-0-134-136.ec2.internal\" DevicePath \"\""
Apr 21 04:06:46.842145 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:46.842116 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30zqzmn"
Apr 21 04:06:46.842351 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:46.842112 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30zqzmn" event={"ID":"3d72c2a9-f78a-41a6-b215-769fc412ff03","Type":"ContainerDied","Data":"c2084b3afb5878dd0a3dd45c7747ec9c736420423409bc107d0469242b45fb94"}
Apr 21 04:06:46.842351 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:46.842210 2578 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c2084b3afb5878dd0a3dd45c7747ec9c736420423409bc107d0469242b45fb94"
Apr 21 04:06:46.843885 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:46.843858 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88bxzp8" event={"ID":"99eb7f21-72ec-4d9b-9ab8-b800ec9fd483","Type":"ContainerDied","Data":"f13a12ca0a383c8f25b1dba2a6021b0a331cdbac8b503f8a4572106cabbf38a9"}
Apr 21 04:06:46.843885 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:46.843888 2578 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f13a12ca0a383c8f25b1dba2a6021b0a331cdbac8b503f8a4572106cabbf38a9"
Apr 21 04:06:46.844096 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:46.843892 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88bxzp8"
Apr 21 04:06:46.845733 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:46.845701 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503pp9jf" event={"ID":"06aa4f62-8552-4318-9c55-d065c4f6ca0d","Type":"ContainerDied","Data":"51d8431f368cdd9bad764503478d97ad4d0b8d5b64801439c7411e407290fe74"}
Apr 21 04:06:46.845733 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:46.845736 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503pp9jf"
Apr 21 04:06:46.845915 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:46.845736 2578 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="51d8431f368cdd9bad764503478d97ad4d0b8d5b64801439c7411e407290fe74"
Apr 21 04:06:46.847565 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:46.847535 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bg5bwh" event={"ID":"d29cdec6-bc44-4468-be4a-6696bb7bc4a9","Type":"ContainerDied","Data":"cf928a0e5f4a44a5debcf4c169b40d80621fa0d9db7fc69da4d58e87c7658162"}
Apr 21 04:06:46.847703 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:46.847569 2578 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cf928a0e5f4a44a5debcf4c169b40d80621fa0d9db7fc69da4d58e87c7658162"
Apr 21 04:06:46.847703 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:46.847597 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bg5bwh"
Apr 21 04:06:48.739922 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:48.739884 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-875d59bd6-mqc5b"
Apr 21 04:06:48.740318 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:48.739935 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-875d59bd6-mqc5b"
Apr 21 04:06:48.744779 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:48.744752 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-875d59bd6-mqc5b"
Apr 21 04:06:48.864998 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:48.864969 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-875d59bd6-mqc5b"
Apr 21 04:06:48.910201 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:48.910172 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5d49d4b6dc-66ft2"]
Apr 21 04:06:56.348640 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:56.348604 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-pxrf7"]
Apr 21 04:06:56.349229 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:56.349163 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="06aa4f62-8552-4318-9c55-d065c4f6ca0d" containerName="pull"
Apr 21 04:06:56.349229 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:56.349186 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="06aa4f62-8552-4318-9c55-d065c4f6ca0d" containerName="pull"
Apr 21 04:06:56.349229 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:56.349199 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3d72c2a9-f78a-41a6-b215-769fc412ff03" containerName="extract"
Apr 21 04:06:56.349229 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:56.349208 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d72c2a9-f78a-41a6-b215-769fc412ff03" containerName="extract"
Apr 21 04:06:56.349229 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:56.349219 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="06aa4f62-8552-4318-9c55-d065c4f6ca0d" containerName="util"
Apr 21 04:06:56.349229 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:56.349229 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="06aa4f62-8552-4318-9c55-d065c4f6ca0d" containerName="util"
Apr 21 04:06:56.349576 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:56.349238 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3d72c2a9-f78a-41a6-b215-769fc412ff03" containerName="util"
Apr 21 04:06:56.349576 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:56.349248 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d72c2a9-f78a-41a6-b215-769fc412ff03" containerName="util"
Apr 21 04:06:56.349576 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:56.349256 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="06aa4f62-8552-4318-9c55-d065c4f6ca0d" containerName="extract"
Apr 21 04:06:56.349576 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:56.349263 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="06aa4f62-8552-4318-9c55-d065c4f6ca0d" containerName="extract"
Apr 21 04:06:56.349576 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:56.349273 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="99eb7f21-72ec-4d9b-9ab8-b800ec9fd483" containerName="extract"
Apr 21 04:06:56.349576 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:56.349299 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="99eb7f21-72ec-4d9b-9ab8-b800ec9fd483" containerName="extract"
Apr 21 04:06:56.349576 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:56.349317 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="99eb7f21-72ec-4d9b-9ab8-b800ec9fd483" containerName="util"
Apr 21 04:06:56.349576 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:56.349325 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="99eb7f21-72ec-4d9b-9ab8-b800ec9fd483" containerName="util"
Apr 21 04:06:56.349576 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:56.349336 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3d72c2a9-f78a-41a6-b215-769fc412ff03" containerName="pull"
Apr 21 04:06:56.349576 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:56.349345 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d72c2a9-f78a-41a6-b215-769fc412ff03" containerName="pull"
Apr 21 04:06:56.349576 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:56.349355 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d29cdec6-bc44-4468-be4a-6696bb7bc4a9" containerName="extract"
Apr 21 04:06:56.349576 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:56.349363 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="d29cdec6-bc44-4468-be4a-6696bb7bc4a9" containerName="extract"
Apr 21 04:06:56.349576 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:56.349374 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d29cdec6-bc44-4468-be4a-6696bb7bc4a9" containerName="util"
Apr 21 04:06:56.349576 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:56.349382 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="d29cdec6-bc44-4468-be4a-6696bb7bc4a9" containerName="util"
Apr 21 04:06:56.349576 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:56.349400 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="99eb7f21-72ec-4d9b-9ab8-b800ec9fd483" containerName="pull"
Apr 21 04:06:56.349576 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:56.349409 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="99eb7f21-72ec-4d9b-9ab8-b800ec9fd483" containerName="pull"
Apr 21 04:06:56.349576 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:56.349428 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d29cdec6-bc44-4468-be4a-6696bb7bc4a9" containerName="pull"
Apr 21 04:06:56.349576 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:56.349436 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="d29cdec6-bc44-4468-be4a-6696bb7bc4a9" containerName="pull"
Apr 21 04:06:56.349576 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:56.349539 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="99eb7f21-72ec-4d9b-9ab8-b800ec9fd483" containerName="extract"
Apr 21 04:06:56.349576 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:56.349553 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="06aa4f62-8552-4318-9c55-d065c4f6ca0d" containerName="extract"
Apr 21 04:06:56.349576 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:56.349567 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="3d72c2a9-f78a-41a6-b215-769fc412ff03" containerName="extract"
Apr 21 04:06:56.349576 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:56.349577 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="d29cdec6-bc44-4468-be4a-6696bb7bc4a9" containerName="extract"
Apr 21 04:06:56.360863 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:56.360835 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-pxrf7"
Apr 21 04:06:56.364000 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:56.363967 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-pxrf7"]
Apr 21 04:06:56.367606 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:56.367580 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\""
Apr 21 04:06:56.367754 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:56.367630 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\""
Apr 21 04:06:56.367754 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:56.367669 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-9q55p\""
Apr 21 04:06:56.431974 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:56.431934 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xp87j\" (UniqueName: \"kubernetes.io/projected/bf70dc95-c70a-4ad1-9c9e-621f0cbc767f-kube-api-access-xp87j\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-pxrf7\" (UID: \"bf70dc95-c70a-4ad1-9c9e-621f0cbc767f\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-pxrf7"
Apr 21 04:06:56.432159 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:56.431985 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/bf70dc95-c70a-4ad1-9c9e-621f0cbc767f-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-pxrf7\" (UID: \"bf70dc95-c70a-4ad1-9c9e-621f0cbc767f\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-pxrf7"
Apr 21 04:06:56.532667
ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:56.532630 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xp87j\" (UniqueName: \"kubernetes.io/projected/bf70dc95-c70a-4ad1-9c9e-621f0cbc767f-kube-api-access-xp87j\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-pxrf7\" (UID: \"bf70dc95-c70a-4ad1-9c9e-621f0cbc767f\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-pxrf7" Apr 21 04:06:56.532874 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:56.532685 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/bf70dc95-c70a-4ad1-9c9e-621f0cbc767f-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-pxrf7\" (UID: \"bf70dc95-c70a-4ad1-9c9e-621f0cbc767f\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-pxrf7" Apr 21 04:06:56.533103 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:56.533084 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/bf70dc95-c70a-4ad1-9c9e-621f0cbc767f-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-pxrf7\" (UID: \"bf70dc95-c70a-4ad1-9c9e-621f0cbc767f\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-pxrf7" Apr 21 04:06:56.541402 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:56.541373 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xp87j\" (UniqueName: \"kubernetes.io/projected/bf70dc95-c70a-4ad1-9c9e-621f0cbc767f-kube-api-access-xp87j\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-pxrf7\" (UID: \"bf70dc95-c70a-4ad1-9c9e-621f0cbc767f\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-pxrf7" Apr 21 04:06:56.671933 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:56.671839 2578 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-pxrf7" Apr 21 04:06:56.808705 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:56.808675 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-pxrf7"] Apr 21 04:06:56.811471 ip-10-0-134-136 kubenswrapper[2578]: W0421 04:06:56.811433 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbf70dc95_c70a_4ad1_9c9e_621f0cbc767f.slice/crio-e669d4010aeb51bb21bc5e92f66379423e935cab013fba62fc856dc9361aa3ac WatchSource:0}: Error finding container e669d4010aeb51bb21bc5e92f66379423e935cab013fba62fc856dc9361aa3ac: Status 404 returned error can't find the container with id e669d4010aeb51bb21bc5e92f66379423e935cab013fba62fc856dc9361aa3ac Apr 21 04:06:56.891082 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:56.891050 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-pxrf7" event={"ID":"bf70dc95-c70a-4ad1-9c9e-621f0cbc767f","Type":"ContainerStarted","Data":"e669d4010aeb51bb21bc5e92f66379423e935cab013fba62fc856dc9361aa3ac"} Apr 21 04:06:59.546870 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:59.546827 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-mmqxf"] Apr 21 04:06:59.550627 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:59.550594 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-mmqxf" Apr 21 04:06:59.553294 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:59.553259 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"limitador-operator-controller-manager-dockercfg-kjl2g\"" Apr 21 04:06:59.558176 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:59.558147 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-mmqxf"] Apr 21 04:06:59.665522 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:59.665482 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ttdb4\" (UniqueName: \"kubernetes.io/projected/353bd64b-3ae5-4bc7-8c28-649edb6c99a2-kube-api-access-ttdb4\") pod \"limitador-operator-controller-manager-c7fb4c8d5-mmqxf\" (UID: \"353bd64b-3ae5-4bc7-8c28-649edb6c99a2\") " pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-mmqxf" Apr 21 04:06:59.766227 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:59.766190 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ttdb4\" (UniqueName: \"kubernetes.io/projected/353bd64b-3ae5-4bc7-8c28-649edb6c99a2-kube-api-access-ttdb4\") pod \"limitador-operator-controller-manager-c7fb4c8d5-mmqxf\" (UID: \"353bd64b-3ae5-4bc7-8c28-649edb6c99a2\") " pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-mmqxf" Apr 21 04:06:59.779610 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:59.779577 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ttdb4\" (UniqueName: \"kubernetes.io/projected/353bd64b-3ae5-4bc7-8c28-649edb6c99a2-kube-api-access-ttdb4\") pod \"limitador-operator-controller-manager-c7fb4c8d5-mmqxf\" (UID: \"353bd64b-3ae5-4bc7-8c28-649edb6c99a2\") " pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-mmqxf" 
Apr 21 04:06:59.865602 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:06:59.865507 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-mmqxf" Apr 21 04:07:00.898202 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:07:00.898164 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-mmqxf"] Apr 21 04:07:00.899934 ip-10-0-134-136 kubenswrapper[2578]: W0421 04:07:00.899908 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod353bd64b_3ae5_4bc7_8c28_649edb6c99a2.slice/crio-4eec6e9e071505b0f2525c8642ab8e400d1f6ae4a995fc463684845b1b63f985 WatchSource:0}: Error finding container 4eec6e9e071505b0f2525c8642ab8e400d1f6ae4a995fc463684845b1b63f985: Status 404 returned error can't find the container with id 4eec6e9e071505b0f2525c8642ab8e400d1f6ae4a995fc463684845b1b63f985 Apr 21 04:07:00.909236 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:07:00.909205 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-mmqxf" event={"ID":"353bd64b-3ae5-4bc7-8c28-649edb6c99a2","Type":"ContainerStarted","Data":"4eec6e9e071505b0f2525c8642ab8e400d1f6ae4a995fc463684845b1b63f985"} Apr 21 04:07:01.917024 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:07:01.916980 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-pxrf7" event={"ID":"bf70dc95-c70a-4ad1-9c9e-621f0cbc767f","Type":"ContainerStarted","Data":"4f5973ffd55637e61a158964299acbd7ed0ede5e7bf91f2bf4f7b588d45704e6"} Apr 21 04:07:01.917839 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:07:01.917812 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-pxrf7" Apr 21 04:07:01.938821 
ip-10-0-134-136 kubenswrapper[2578]: I0421 04:07:01.938752 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-pxrf7" podStartSLOduration=1.902137716 podStartE2EDuration="5.938733081s" podCreationTimestamp="2026-04-21 04:06:56 +0000 UTC" firstStartedPulling="2026-04-21 04:06:56.813885566 +0000 UTC m=+560.483324620" lastFinishedPulling="2026-04-21 04:07:00.85048092 +0000 UTC m=+564.519919985" observedRunningTime="2026-04-21 04:07:01.937072158 +0000 UTC m=+565.606511239" watchObservedRunningTime="2026-04-21 04:07:01.938733081 +0000 UTC m=+565.608172169" Apr 21 04:07:03.926843 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:07:03.926809 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-mmqxf" event={"ID":"353bd64b-3ae5-4bc7-8c28-649edb6c99a2","Type":"ContainerStarted","Data":"487a43072f45c794c724da06a452d2901fda881e8b7afb1d4ace2e2fb64c71a0"} Apr 21 04:07:03.927241 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:07:03.926935 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-mmqxf" Apr 21 04:07:03.944732 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:07:03.944680 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-mmqxf" podStartSLOduration=2.533420637 podStartE2EDuration="4.944663559s" podCreationTimestamp="2026-04-21 04:06:59 +0000 UTC" firstStartedPulling="2026-04-21 04:07:00.902134457 +0000 UTC m=+564.571573509" lastFinishedPulling="2026-04-21 04:07:03.313377363 +0000 UTC m=+566.982816431" observedRunningTime="2026-04-21 04:07:03.942809718 +0000 UTC m=+567.612248793" watchObservedRunningTime="2026-04-21 04:07:03.944663559 +0000 UTC m=+567.614102634" Apr 21 04:07:12.923603 ip-10-0-134-136 kubenswrapper[2578]: I0421 
04:07:12.923574 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-pxrf7" Apr 21 04:07:13.932895 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:07:13.932850 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-5d49d4b6dc-66ft2" podUID="801f9aa9-c09f-4657-a23f-e88496dbbdd1" containerName="console" containerID="cri-o://8c271bb56c22ace11851a27965393ec0547f9d509c32b0eef3483db622db7c12" gracePeriod=15 Apr 21 04:07:14.181482 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:07:14.181459 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5d49d4b6dc-66ft2_801f9aa9-c09f-4657-a23f-e88496dbbdd1/console/0.log" Apr 21 04:07:14.181614 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:07:14.181524 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5d49d4b6dc-66ft2" Apr 21 04:07:14.304539 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:07:14.304501 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gqplv\" (UniqueName: \"kubernetes.io/projected/801f9aa9-c09f-4657-a23f-e88496dbbdd1-kube-api-access-gqplv\") pod \"801f9aa9-c09f-4657-a23f-e88496dbbdd1\" (UID: \"801f9aa9-c09f-4657-a23f-e88496dbbdd1\") " Apr 21 04:07:14.304712 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:07:14.304550 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/801f9aa9-c09f-4657-a23f-e88496dbbdd1-console-oauth-config\") pod \"801f9aa9-c09f-4657-a23f-e88496dbbdd1\" (UID: \"801f9aa9-c09f-4657-a23f-e88496dbbdd1\") " Apr 21 04:07:14.304712 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:07:14.304597 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/801f9aa9-c09f-4657-a23f-e88496dbbdd1-oauth-serving-cert\") pod \"801f9aa9-c09f-4657-a23f-e88496dbbdd1\" (UID: \"801f9aa9-c09f-4657-a23f-e88496dbbdd1\") " Apr 21 04:07:14.304712 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:07:14.304619 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/801f9aa9-c09f-4657-a23f-e88496dbbdd1-console-config\") pod \"801f9aa9-c09f-4657-a23f-e88496dbbdd1\" (UID: \"801f9aa9-c09f-4657-a23f-e88496dbbdd1\") " Apr 21 04:07:14.304712 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:07:14.304681 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/801f9aa9-c09f-4657-a23f-e88496dbbdd1-service-ca\") pod \"801f9aa9-c09f-4657-a23f-e88496dbbdd1\" (UID: \"801f9aa9-c09f-4657-a23f-e88496dbbdd1\") " Apr 21 04:07:14.304712 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:07:14.304704 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/801f9aa9-c09f-4657-a23f-e88496dbbdd1-console-serving-cert\") pod \"801f9aa9-c09f-4657-a23f-e88496dbbdd1\" (UID: \"801f9aa9-c09f-4657-a23f-e88496dbbdd1\") " Apr 21 04:07:14.304981 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:07:14.304762 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/801f9aa9-c09f-4657-a23f-e88496dbbdd1-trusted-ca-bundle\") pod \"801f9aa9-c09f-4657-a23f-e88496dbbdd1\" (UID: \"801f9aa9-c09f-4657-a23f-e88496dbbdd1\") " Apr 21 04:07:14.305156 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:07:14.305108 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/801f9aa9-c09f-4657-a23f-e88496dbbdd1-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod 
"801f9aa9-c09f-4657-a23f-e88496dbbdd1" (UID: "801f9aa9-c09f-4657-a23f-e88496dbbdd1"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 04:07:14.305156 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:07:14.305116 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/801f9aa9-c09f-4657-a23f-e88496dbbdd1-console-config" (OuterVolumeSpecName: "console-config") pod "801f9aa9-c09f-4657-a23f-e88496dbbdd1" (UID: "801f9aa9-c09f-4657-a23f-e88496dbbdd1"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 04:07:14.305156 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:07:14.305126 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/801f9aa9-c09f-4657-a23f-e88496dbbdd1-service-ca" (OuterVolumeSpecName: "service-ca") pod "801f9aa9-c09f-4657-a23f-e88496dbbdd1" (UID: "801f9aa9-c09f-4657-a23f-e88496dbbdd1"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 04:07:14.305451 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:07:14.305428 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/801f9aa9-c09f-4657-a23f-e88496dbbdd1-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "801f9aa9-c09f-4657-a23f-e88496dbbdd1" (UID: "801f9aa9-c09f-4657-a23f-e88496dbbdd1"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 04:07:14.306831 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:07:14.306811 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/801f9aa9-c09f-4657-a23f-e88496dbbdd1-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "801f9aa9-c09f-4657-a23f-e88496dbbdd1" (UID: "801f9aa9-c09f-4657-a23f-e88496dbbdd1"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 04:07:14.306902 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:07:14.306846 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/801f9aa9-c09f-4657-a23f-e88496dbbdd1-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "801f9aa9-c09f-4657-a23f-e88496dbbdd1" (UID: "801f9aa9-c09f-4657-a23f-e88496dbbdd1"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 04:07:14.306902 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:07:14.306887 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/801f9aa9-c09f-4657-a23f-e88496dbbdd1-kube-api-access-gqplv" (OuterVolumeSpecName: "kube-api-access-gqplv") pod "801f9aa9-c09f-4657-a23f-e88496dbbdd1" (UID: "801f9aa9-c09f-4657-a23f-e88496dbbdd1"). InnerVolumeSpecName "kube-api-access-gqplv". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 04:07:14.405874 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:07:14.405836 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gqplv\" (UniqueName: \"kubernetes.io/projected/801f9aa9-c09f-4657-a23f-e88496dbbdd1-kube-api-access-gqplv\") on node \"ip-10-0-134-136.ec2.internal\" DevicePath \"\"" Apr 21 04:07:14.405874 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:07:14.405867 2578 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/801f9aa9-c09f-4657-a23f-e88496dbbdd1-console-oauth-config\") on node \"ip-10-0-134-136.ec2.internal\" DevicePath \"\"" Apr 21 04:07:14.405874 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:07:14.405876 2578 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/801f9aa9-c09f-4657-a23f-e88496dbbdd1-oauth-serving-cert\") on node \"ip-10-0-134-136.ec2.internal\" DevicePath \"\"" Apr 21 04:07:14.405874 
ip-10-0-134-136 kubenswrapper[2578]: I0421 04:07:14.405885 2578 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/801f9aa9-c09f-4657-a23f-e88496dbbdd1-console-config\") on node \"ip-10-0-134-136.ec2.internal\" DevicePath \"\"" Apr 21 04:07:14.406143 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:07:14.405894 2578 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/801f9aa9-c09f-4657-a23f-e88496dbbdd1-service-ca\") on node \"ip-10-0-134-136.ec2.internal\" DevicePath \"\"" Apr 21 04:07:14.406143 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:07:14.405903 2578 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/801f9aa9-c09f-4657-a23f-e88496dbbdd1-console-serving-cert\") on node \"ip-10-0-134-136.ec2.internal\" DevicePath \"\"" Apr 21 04:07:14.406143 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:07:14.405911 2578 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/801f9aa9-c09f-4657-a23f-e88496dbbdd1-trusted-ca-bundle\") on node \"ip-10-0-134-136.ec2.internal\" DevicePath \"\"" Apr 21 04:07:14.933995 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:07:14.933961 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-mmqxf" Apr 21 04:07:14.969256 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:07:14.969226 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5d49d4b6dc-66ft2_801f9aa9-c09f-4657-a23f-e88496dbbdd1/console/0.log" Apr 21 04:07:14.969465 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:07:14.969292 2578 generic.go:358] "Generic (PLEG): container finished" podID="801f9aa9-c09f-4657-a23f-e88496dbbdd1" containerID="8c271bb56c22ace11851a27965393ec0547f9d509c32b0eef3483db622db7c12" exitCode=2 Apr 21 
04:07:14.969465 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:07:14.969333 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5d49d4b6dc-66ft2" event={"ID":"801f9aa9-c09f-4657-a23f-e88496dbbdd1","Type":"ContainerDied","Data":"8c271bb56c22ace11851a27965393ec0547f9d509c32b0eef3483db622db7c12"} Apr 21 04:07:14.969465 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:07:14.969382 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5d49d4b6dc-66ft2" event={"ID":"801f9aa9-c09f-4657-a23f-e88496dbbdd1","Type":"ContainerDied","Data":"2da1e4503eb09021ad4ee943c33b744a4f70ebae46233d474344e8c8ffc0254e"} Apr 21 04:07:14.969465 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:07:14.969389 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5d49d4b6dc-66ft2" Apr 21 04:07:14.969465 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:07:14.969402 2578 scope.go:117] "RemoveContainer" containerID="8c271bb56c22ace11851a27965393ec0547f9d509c32b0eef3483db622db7c12" Apr 21 04:07:14.978720 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:07:14.978699 2578 scope.go:117] "RemoveContainer" containerID="8c271bb56c22ace11851a27965393ec0547f9d509c32b0eef3483db622db7c12" Apr 21 04:07:14.979009 ip-10-0-134-136 kubenswrapper[2578]: E0421 04:07:14.978990 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c271bb56c22ace11851a27965393ec0547f9d509c32b0eef3483db622db7c12\": container with ID starting with 8c271bb56c22ace11851a27965393ec0547f9d509c32b0eef3483db622db7c12 not found: ID does not exist" containerID="8c271bb56c22ace11851a27965393ec0547f9d509c32b0eef3483db622db7c12" Apr 21 04:07:14.979077 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:07:14.979027 2578 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"8c271bb56c22ace11851a27965393ec0547f9d509c32b0eef3483db622db7c12"} err="failed to get container status \"8c271bb56c22ace11851a27965393ec0547f9d509c32b0eef3483db622db7c12\": rpc error: code = NotFound desc = could not find container \"8c271bb56c22ace11851a27965393ec0547f9d509c32b0eef3483db622db7c12\": container with ID starting with 8c271bb56c22ace11851a27965393ec0547f9d509c32b0eef3483db622db7c12 not found: ID does not exist" Apr 21 04:07:14.991519 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:07:14.991482 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5d49d4b6dc-66ft2"] Apr 21 04:07:14.994296 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:07:14.994255 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-5d49d4b6dc-66ft2"] Apr 21 04:07:16.929614 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:07:16.929582 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="801f9aa9-c09f-4657-a23f-e88496dbbdd1" path="/var/lib/kubelet/pods/801f9aa9-c09f-4657-a23f-e88496dbbdd1/volumes" Apr 21 04:07:36.853873 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:07:36.853843 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-82wml_37b63820-2bf0-4a5b-82c5-6b56ab7689b7/ovn-acl-logging/0.log" Apr 21 04:07:36.855864 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:07:36.855842 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-82wml_37b63820-2bf0-4a5b-82c5-6b56ab7689b7/ovn-acl-logging/0.log" Apr 21 04:12:36.887165 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:12:36.887131 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-82wml_37b63820-2bf0-4a5b-82c5-6b56ab7689b7/ovn-acl-logging/0.log" Apr 21 04:12:36.889075 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:12:36.889045 2578 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-82wml_37b63820-2bf0-4a5b-82c5-6b56ab7689b7/ovn-acl-logging/0.log" Apr 21 04:17:36.918459 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:17:36.918423 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-82wml_37b63820-2bf0-4a5b-82c5-6b56ab7689b7/ovn-acl-logging/0.log" Apr 21 04:17:36.921333 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:17:36.921309 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-82wml_37b63820-2bf0-4a5b-82c5-6b56ab7689b7/ovn-acl-logging/0.log" Apr 21 04:18:02.112622 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:18:02.112592 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-6ddf9554fc-pxrf7_bf70dc95-c70a-4ad1-9c9e-621f0cbc767f/manager/0.log" Apr 21 04:18:21.355873 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:18:21.355773 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-7cd77c7ffd-gzshw_841ef081-9046-47bb-8f1e-7b662ff2b695/discovery/0.log" Apr 21 04:18:21.367959 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:18:21.367928 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_openshift-ai-inference-openshift-default-7c5447bb76-j9s6j_0cf4f637-103b-4401-9d5e-1ab49201634c/istio-proxy/0.log" Apr 21 04:18:21.382264 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:18:21.382235 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-598f8f588b-lr9v8_c8c7e63b-ee59-4ec9-bf86-150c2642daa7/router/0.log" Apr 21 04:18:27.075947 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:18:27.075896 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-dhdf9/must-gather-5jcr4"] Apr 21 04:18:27.076369 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:18:27.076354 2578 cpu_manager.go:401] "RemoveStaleState: 
containerMap: removing container" podUID="801f9aa9-c09f-4657-a23f-e88496dbbdd1" containerName="console" Apr 21 04:18:27.076419 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:18:27.076372 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="801f9aa9-c09f-4657-a23f-e88496dbbdd1" containerName="console" Apr 21 04:18:27.076454 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:18:27.076433 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="801f9aa9-c09f-4657-a23f-e88496dbbdd1" containerName="console" Apr 21 04:18:27.079492 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:18:27.079478 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-dhdf9/must-gather-5jcr4" Apr 21 04:18:27.082175 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:18:27.082151 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-dhdf9\"/\"kube-root-ca.crt\"" Apr 21 04:18:27.082311 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:18:27.082158 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-dhdf9\"/\"default-dockercfg-h8wlt\"" Apr 21 04:18:27.083141 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:18:27.083082 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-dhdf9\"/\"openshift-service-ca.crt\"" Apr 21 04:18:27.086441 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:18:27.086418 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-dhdf9/must-gather-5jcr4"] Apr 21 04:18:27.252500 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:18:27.252453 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/4c90157b-a8b3-4a8e-802c-685717fb24d7-must-gather-output\") pod \"must-gather-5jcr4\" (UID: \"4c90157b-a8b3-4a8e-802c-685717fb24d7\") " 
pod="openshift-must-gather-dhdf9/must-gather-5jcr4" Apr 21 04:18:27.252687 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:18:27.252582 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sck8k\" (UniqueName: \"kubernetes.io/projected/4c90157b-a8b3-4a8e-802c-685717fb24d7-kube-api-access-sck8k\") pod \"must-gather-5jcr4\" (UID: \"4c90157b-a8b3-4a8e-802c-685717fb24d7\") " pod="openshift-must-gather-dhdf9/must-gather-5jcr4" Apr 21 04:18:27.353985 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:18:27.353891 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sck8k\" (UniqueName: \"kubernetes.io/projected/4c90157b-a8b3-4a8e-802c-685717fb24d7-kube-api-access-sck8k\") pod \"must-gather-5jcr4\" (UID: \"4c90157b-a8b3-4a8e-802c-685717fb24d7\") " pod="openshift-must-gather-dhdf9/must-gather-5jcr4" Apr 21 04:18:27.353985 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:18:27.353964 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/4c90157b-a8b3-4a8e-802c-685717fb24d7-must-gather-output\") pod \"must-gather-5jcr4\" (UID: \"4c90157b-a8b3-4a8e-802c-685717fb24d7\") " pod="openshift-must-gather-dhdf9/must-gather-5jcr4" Apr 21 04:18:27.354310 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:18:27.354269 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/4c90157b-a8b3-4a8e-802c-685717fb24d7-must-gather-output\") pod \"must-gather-5jcr4\" (UID: \"4c90157b-a8b3-4a8e-802c-685717fb24d7\") " pod="openshift-must-gather-dhdf9/must-gather-5jcr4" Apr 21 04:18:27.362162 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:18:27.362137 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sck8k\" (UniqueName: 
\"kubernetes.io/projected/4c90157b-a8b3-4a8e-802c-685717fb24d7-kube-api-access-sck8k\") pod \"must-gather-5jcr4\" (UID: \"4c90157b-a8b3-4a8e-802c-685717fb24d7\") " pod="openshift-must-gather-dhdf9/must-gather-5jcr4" Apr 21 04:18:27.399075 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:18:27.399036 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-dhdf9/must-gather-5jcr4" Apr 21 04:18:27.726798 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:18:27.726770 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-dhdf9/must-gather-5jcr4"] Apr 21 04:18:27.728485 ip-10-0-134-136 kubenswrapper[2578]: W0421 04:18:27.728454 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4c90157b_a8b3_4a8e_802c_685717fb24d7.slice/crio-f5f3fe3c3205be8c533655da287f9f4f1e80749f5f2b9d357fb92e7c60dd9130 WatchSource:0}: Error finding container f5f3fe3c3205be8c533655da287f9f4f1e80749f5f2b9d357fb92e7c60dd9130: Status 404 returned error can't find the container with id f5f3fe3c3205be8c533655da287f9f4f1e80749f5f2b9d357fb92e7c60dd9130 Apr 21 04:18:27.730127 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:18:27.730110 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 21 04:18:28.571260 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:18:28.571217 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dhdf9/must-gather-5jcr4" event={"ID":"4c90157b-a8b3-4a8e-802c-685717fb24d7","Type":"ContainerStarted","Data":"020769608d7e60c0d5dcecf33a67ae2ed02d486bb971c2f6c1a9157698261be1"} Apr 21 04:18:28.571771 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:18:28.571268 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dhdf9/must-gather-5jcr4" 
event={"ID":"4c90157b-a8b3-4a8e-802c-685717fb24d7","Type":"ContainerStarted","Data":"f5f3fe3c3205be8c533655da287f9f4f1e80749f5f2b9d357fb92e7c60dd9130"} Apr 21 04:18:29.577128 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:18:29.577089 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dhdf9/must-gather-5jcr4" event={"ID":"4c90157b-a8b3-4a8e-802c-685717fb24d7","Type":"ContainerStarted","Data":"83ee6406704e1320bfe40869f815e3a40971da68d1bfb6a57a5aa1a36fdd0f9c"} Apr 21 04:18:29.593243 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:18:29.593185 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-dhdf9/must-gather-5jcr4" podStartSLOduration=1.916349457 podStartE2EDuration="2.593171166s" podCreationTimestamp="2026-04-21 04:18:27 +0000 UTC" firstStartedPulling="2026-04-21 04:18:27.73023226 +0000 UTC m=+1251.399671314" lastFinishedPulling="2026-04-21 04:18:28.40705397 +0000 UTC m=+1252.076493023" observedRunningTime="2026-04-21 04:18:29.591461345 +0000 UTC m=+1253.260900420" watchObservedRunningTime="2026-04-21 04:18:29.593171166 +0000 UTC m=+1253.262610241" Apr 21 04:18:29.849583 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:18:29.849481 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-t75ld_ee91e745-0c10-4d73-b8fd-a96e27ea14b8/global-pull-secret-syncer/0.log" Apr 21 04:18:29.931887 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:18:29.931858 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-7vx69_adbc5b28-9686-4704-b170-0ada296e15b8/konnectivity-agent/0.log" Apr 21 04:18:30.019072 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:18:30.019040 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-134-136.ec2.internal_68f04465e53eef24d16ecd9de5ad5a12/haproxy/0.log" Apr 21 04:18:34.061609 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:18:34.061561 2578 
log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-6ddf9554fc-pxrf7_bf70dc95-c70a-4ad1-9c9e-621f0cbc767f/manager/0.log" Apr 21 04:18:34.125176 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:18:34.125143 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-c7fb4c8d5-mmqxf_353bd64b-3ae5-4bc7-8c28-649edb6c99a2/manager/0.log" Apr 21 04:18:35.004251 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:18:35.004216 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_03548feb-c89f-456f-97c0-dd5867c02ca1/alertmanager/0.log" Apr 21 04:18:35.031149 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:18:35.031112 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_03548feb-c89f-456f-97c0-dd5867c02ca1/config-reloader/0.log" Apr 21 04:18:35.060748 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:18:35.060716 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_03548feb-c89f-456f-97c0-dd5867c02ca1/kube-rbac-proxy-web/0.log" Apr 21 04:18:35.084996 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:18:35.084962 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_03548feb-c89f-456f-97c0-dd5867c02ca1/kube-rbac-proxy/0.log" Apr 21 04:18:35.112464 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:18:35.112424 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_03548feb-c89f-456f-97c0-dd5867c02ca1/kube-rbac-proxy-metric/0.log" Apr 21 04:18:35.137188 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:18:35.137161 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_03548feb-c89f-456f-97c0-dd5867c02ca1/prom-label-proxy/0.log" Apr 21 04:18:35.163582 ip-10-0-134-136 kubenswrapper[2578]: 
I0421 04:18:35.163556 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_03548feb-c89f-456f-97c0-dd5867c02ca1/init-config-reloader/0.log" Apr 21 04:18:35.206433 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:18:35.206403 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-9mdst_474836e5-bd94-4a4c-a1ec-ee329f804cb6/cluster-monitoring-operator/0.log" Apr 21 04:18:35.301536 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:18:35.301502 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-fd55fd488-h8sm5_972cb6a5-a2e1-4220-b72b-f7d3cf55ce7a/metrics-server/0.log" Apr 21 04:18:35.327329 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:18:35.327297 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-r88vj_976a9808-8386-44d6-964f-4f35e8b7bf8f/monitoring-plugin/0.log" Apr 21 04:18:35.499619 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:18:35.499591 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-zmnts_aa4733fb-75e2-4c77-bd5d-7ea904966689/node-exporter/0.log" Apr 21 04:18:35.522969 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:18:35.522943 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-zmnts_aa4733fb-75e2-4c77-bd5d-7ea904966689/kube-rbac-proxy/0.log" Apr 21 04:18:35.544879 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:18:35.544852 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-zmnts_aa4733fb-75e2-4c77-bd5d-7ea904966689/init-textfile/0.log" Apr 21 04:18:35.842818 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:18:35.842782 2578 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-jrv6d_5cbfce1e-fdfb-4e8b-b9d6-626294c831f0/prometheus-operator/0.log" Apr 21 04:18:35.861935 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:18:35.861855 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-jrv6d_5cbfce1e-fdfb-4e8b-b9d6-626294c831f0/kube-rbac-proxy/0.log" Apr 21 04:18:35.920914 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:18:35.920867 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-97bc57f6c-5m8lx_67d68952-11d7-4886-b63f-c39a5c3ef6d9/telemeter-client/0.log" Apr 21 04:18:35.944708 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:18:35.944675 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-97bc57f6c-5m8lx_67d68952-11d7-4886-b63f-c39a5c3ef6d9/reload/0.log" Apr 21 04:18:35.965644 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:18:35.965560 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-97bc57f6c-5m8lx_67d68952-11d7-4886-b63f-c39a5c3ef6d9/kube-rbac-proxy/0.log" Apr 21 04:18:35.997682 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:18:35.997654 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6c69947b7-mcmmj_b59fc3b6-a6a1-4455-b9b3-ae8335e2a143/thanos-query/0.log" Apr 21 04:18:36.020144 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:18:36.020111 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6c69947b7-mcmmj_b59fc3b6-a6a1-4455-b9b3-ae8335e2a143/kube-rbac-proxy-web/0.log" Apr 21 04:18:36.040592 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:18:36.040562 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6c69947b7-mcmmj_b59fc3b6-a6a1-4455-b9b3-ae8335e2a143/kube-rbac-proxy/0.log" Apr 21 04:18:36.061170 ip-10-0-134-136 
kubenswrapper[2578]: I0421 04:18:36.061141 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6c69947b7-mcmmj_b59fc3b6-a6a1-4455-b9b3-ae8335e2a143/prom-label-proxy/0.log" Apr 21 04:18:36.083539 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:18:36.083510 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6c69947b7-mcmmj_b59fc3b6-a6a1-4455-b9b3-ae8335e2a143/kube-rbac-proxy-rules/0.log" Apr 21 04:18:36.107724 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:18:36.107675 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6c69947b7-mcmmj_b59fc3b6-a6a1-4455-b9b3-ae8335e2a143/kube-rbac-proxy-metrics/0.log" Apr 21 04:18:38.209781 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:18:38.209751 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-875d59bd6-mqc5b_f3bccf2e-9c8f-49d5-a797-6235b2ebd1b3/console/0.log" Apr 21 04:18:38.240950 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:18:38.240917 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-qf2jm_07645b09-21c7-466b-a46d-b48a72d9c654/download-server/0.log" Apr 21 04:18:38.720188 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:18:38.720145 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-dhdf9/perf-node-gather-daemonset-dxnmw"] Apr 21 04:18:38.726836 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:18:38.726805 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-dhdf9/perf-node-gather-daemonset-dxnmw" Apr 21 04:18:38.731934 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:18:38.731898 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-dhdf9/perf-node-gather-daemonset-dxnmw"] Apr 21 04:18:38.772694 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:18:38.772643 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xrwrk\" (UniqueName: \"kubernetes.io/projected/feb9eea7-08a2-4de2-ac09-c343884f732b-kube-api-access-xrwrk\") pod \"perf-node-gather-daemonset-dxnmw\" (UID: \"feb9eea7-08a2-4de2-ac09-c343884f732b\") " pod="openshift-must-gather-dhdf9/perf-node-gather-daemonset-dxnmw" Apr 21 04:18:38.772909 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:18:38.772818 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/feb9eea7-08a2-4de2-ac09-c343884f732b-podres\") pod \"perf-node-gather-daemonset-dxnmw\" (UID: \"feb9eea7-08a2-4de2-ac09-c343884f732b\") " pod="openshift-must-gather-dhdf9/perf-node-gather-daemonset-dxnmw" Apr 21 04:18:38.772909 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:18:38.772882 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/feb9eea7-08a2-4de2-ac09-c343884f732b-sys\") pod \"perf-node-gather-daemonset-dxnmw\" (UID: \"feb9eea7-08a2-4de2-ac09-c343884f732b\") " pod="openshift-must-gather-dhdf9/perf-node-gather-daemonset-dxnmw" Apr 21 04:18:38.773021 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:18:38.772913 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/feb9eea7-08a2-4de2-ac09-c343884f732b-lib-modules\") pod \"perf-node-gather-daemonset-dxnmw\" (UID: 
\"feb9eea7-08a2-4de2-ac09-c343884f732b\") " pod="openshift-must-gather-dhdf9/perf-node-gather-daemonset-dxnmw" Apr 21 04:18:38.773021 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:18:38.773004 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/feb9eea7-08a2-4de2-ac09-c343884f732b-proc\") pod \"perf-node-gather-daemonset-dxnmw\" (UID: \"feb9eea7-08a2-4de2-ac09-c343884f732b\") " pod="openshift-must-gather-dhdf9/perf-node-gather-daemonset-dxnmw" Apr 21 04:18:38.874186 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:18:38.874151 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/feb9eea7-08a2-4de2-ac09-c343884f732b-sys\") pod \"perf-node-gather-daemonset-dxnmw\" (UID: \"feb9eea7-08a2-4de2-ac09-c343884f732b\") " pod="openshift-must-gather-dhdf9/perf-node-gather-daemonset-dxnmw" Apr 21 04:18:38.874398 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:18:38.874194 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/feb9eea7-08a2-4de2-ac09-c343884f732b-lib-modules\") pod \"perf-node-gather-daemonset-dxnmw\" (UID: \"feb9eea7-08a2-4de2-ac09-c343884f732b\") " pod="openshift-must-gather-dhdf9/perf-node-gather-daemonset-dxnmw" Apr 21 04:18:38.874398 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:18:38.874289 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/feb9eea7-08a2-4de2-ac09-c343884f732b-proc\") pod \"perf-node-gather-daemonset-dxnmw\" (UID: \"feb9eea7-08a2-4de2-ac09-c343884f732b\") " pod="openshift-must-gather-dhdf9/perf-node-gather-daemonset-dxnmw" Apr 21 04:18:38.874398 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:18:38.874297 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/feb9eea7-08a2-4de2-ac09-c343884f732b-sys\") pod \"perf-node-gather-daemonset-dxnmw\" (UID: \"feb9eea7-08a2-4de2-ac09-c343884f732b\") " pod="openshift-must-gather-dhdf9/perf-node-gather-daemonset-dxnmw" Apr 21 04:18:38.874398 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:18:38.874344 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xrwrk\" (UniqueName: \"kubernetes.io/projected/feb9eea7-08a2-4de2-ac09-c343884f732b-kube-api-access-xrwrk\") pod \"perf-node-gather-daemonset-dxnmw\" (UID: \"feb9eea7-08a2-4de2-ac09-c343884f732b\") " pod="openshift-must-gather-dhdf9/perf-node-gather-daemonset-dxnmw" Apr 21 04:18:38.874398 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:18:38.874369 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/feb9eea7-08a2-4de2-ac09-c343884f732b-proc\") pod \"perf-node-gather-daemonset-dxnmw\" (UID: \"feb9eea7-08a2-4de2-ac09-c343884f732b\") " pod="openshift-must-gather-dhdf9/perf-node-gather-daemonset-dxnmw" Apr 21 04:18:38.874595 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:18:38.874413 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/feb9eea7-08a2-4de2-ac09-c343884f732b-podres\") pod \"perf-node-gather-daemonset-dxnmw\" (UID: \"feb9eea7-08a2-4de2-ac09-c343884f732b\") " pod="openshift-must-gather-dhdf9/perf-node-gather-daemonset-dxnmw" Apr 21 04:18:38.874595 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:18:38.874416 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/feb9eea7-08a2-4de2-ac09-c343884f732b-lib-modules\") pod \"perf-node-gather-daemonset-dxnmw\" (UID: \"feb9eea7-08a2-4de2-ac09-c343884f732b\") " pod="openshift-must-gather-dhdf9/perf-node-gather-daemonset-dxnmw" Apr 21 04:18:38.874595 ip-10-0-134-136 kubenswrapper[2578]: I0421 
04:18:38.874548 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/feb9eea7-08a2-4de2-ac09-c343884f732b-podres\") pod \"perf-node-gather-daemonset-dxnmw\" (UID: \"feb9eea7-08a2-4de2-ac09-c343884f732b\") " pod="openshift-must-gather-dhdf9/perf-node-gather-daemonset-dxnmw" Apr 21 04:18:38.884362 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:18:38.883255 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xrwrk\" (UniqueName: \"kubernetes.io/projected/feb9eea7-08a2-4de2-ac09-c343884f732b-kube-api-access-xrwrk\") pod \"perf-node-gather-daemonset-dxnmw\" (UID: \"feb9eea7-08a2-4de2-ac09-c343884f732b\") " pod="openshift-must-gather-dhdf9/perf-node-gather-daemonset-dxnmw" Apr 21 04:18:39.045868 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:18:39.045832 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-dhdf9/perf-node-gather-daemonset-dxnmw" Apr 21 04:18:39.201062 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:18:39.201038 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-dhdf9/perf-node-gather-daemonset-dxnmw"] Apr 21 04:18:39.202585 ip-10-0-134-136 kubenswrapper[2578]: W0421 04:18:39.202551 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podfeb9eea7_08a2_4de2_ac09_c343884f732b.slice/crio-b728b2b0496d555a35c20091f94e32954179ea4f828674a9c209ceac2b4384b8 WatchSource:0}: Error finding container b728b2b0496d555a35c20091f94e32954179ea4f828674a9c209ceac2b4384b8: Status 404 returned error can't find the container with id b728b2b0496d555a35c20091f94e32954179ea4f828674a9c209ceac2b4384b8 Apr 21 04:18:39.433969 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:18:39.433942 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-lk9pj_5b9715dc-7dd5-46ea-961a-2f02107a7655/dns/0.log" Apr 21 04:18:39.454118 
ip-10-0-134-136 kubenswrapper[2578]: I0421 04:18:39.454085 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-lk9pj_5b9715dc-7dd5-46ea-961a-2f02107a7655/kube-rbac-proxy/0.log" Apr 21 04:18:39.539770 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:18:39.539744 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-jxqfz_f4acd3fa-d747-4053-9af8-c38066e122ab/dns-node-resolver/0.log" Apr 21 04:18:39.654992 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:18:39.654902 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dhdf9/perf-node-gather-daemonset-dxnmw" event={"ID":"feb9eea7-08a2-4de2-ac09-c343884f732b","Type":"ContainerStarted","Data":"c716f250f6ee353e0a60ac817038a4634af23c469dc97cc69059a2fabbe3d025"} Apr 21 04:18:39.654992 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:18:39.654942 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dhdf9/perf-node-gather-daemonset-dxnmw" event={"ID":"feb9eea7-08a2-4de2-ac09-c343884f732b","Type":"ContainerStarted","Data":"b728b2b0496d555a35c20091f94e32954179ea4f828674a9c209ceac2b4384b8"} Apr 21 04:18:39.655183 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:18:39.655008 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-dhdf9/perf-node-gather-daemonset-dxnmw" Apr 21 04:18:39.674999 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:18:39.674944 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-dhdf9/perf-node-gather-daemonset-dxnmw" podStartSLOduration=1.674930179 podStartE2EDuration="1.674930179s" podCreationTimestamp="2026-04-21 04:18:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 04:18:39.672398817 +0000 UTC m=+1263.341837932" watchObservedRunningTime="2026-04-21 04:18:39.674930179 +0000 UTC 
m=+1263.344369253" Apr 21 04:18:40.040222 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:18:40.040192 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-jpnv7_30292a27-0318-4f49-b26c-f54654ac07db/node-ca/0.log" Apr 21 04:18:40.833165 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:18:40.833116 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-7cd77c7ffd-gzshw_841ef081-9046-47bb-8f1e-7b662ff2b695/discovery/0.log" Apr 21 04:18:40.857817 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:18:40.857789 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_openshift-ai-inference-openshift-default-7c5447bb76-j9s6j_0cf4f637-103b-4401-9d5e-1ab49201634c/istio-proxy/0.log" Apr 21 04:18:40.883397 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:18:40.883360 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-598f8f588b-lr9v8_c8c7e63b-ee59-4ec9-bf86-150c2642daa7/router/0.log" Apr 21 04:18:41.322247 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:18:41.322217 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-sjcwm_e9350a7b-5fe0-4e30-8c05-5ee260472029/serve-healthcheck-canary/0.log" Apr 21 04:18:41.883943 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:18:41.883919 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-qsn72_f7da8ca3-9b52-4d03-bf6c-2d7bd826ccb1/kube-rbac-proxy/0.log" Apr 21 04:18:41.902478 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:18:41.902445 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-qsn72_f7da8ca3-9b52-4d03-bf6c-2d7bd826ccb1/exporter/0.log" Apr 21 04:18:41.922748 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:18:41.922702 2578 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-insights_insights-runtime-extractor-qsn72_f7da8ca3-9b52-4d03-bf6c-2d7bd826ccb1/extractor/0.log" Apr 21 04:18:45.672800 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:18:45.672770 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-dhdf9/perf-node-gather-daemonset-dxnmw" Apr 21 04:18:47.553659 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:18:47.553628 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-b2x7m_5f93c525-8d0e-4df1-99c5-c8091900d3af/migrator/0.log" Apr 21 04:18:47.577339 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:18:47.577314 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-b2x7m_5f93c525-8d0e-4df1-99c5-c8091900d3af/graceful-termination/0.log" Apr 21 04:18:47.981179 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:18:47.981149 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-pfkbf_047ff53e-3808-49b6-ad81-7bd15d251053/kube-storage-version-migrator-operator/1.log" Apr 21 04:18:47.982598 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:18:47.982528 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-pfkbf_047ff53e-3808-49b6-ad81-7bd15d251053/kube-storage-version-migrator-operator/0.log" Apr 21 04:18:49.097031 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:18:49.096955 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kmxxq_cd7df76b-5df8-4dcb-8e6c-8e98f5533e3b/kube-multus-additional-cni-plugins/0.log" Apr 21 04:18:49.116957 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:18:49.116930 2578 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kmxxq_cd7df76b-5df8-4dcb-8e6c-8e98f5533e3b/egress-router-binary-copy/0.log" Apr 21 04:18:49.136162 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:18:49.136127 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kmxxq_cd7df76b-5df8-4dcb-8e6c-8e98f5533e3b/cni-plugins/0.log" Apr 21 04:18:49.154378 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:18:49.154352 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kmxxq_cd7df76b-5df8-4dcb-8e6c-8e98f5533e3b/bond-cni-plugin/0.log" Apr 21 04:18:49.173209 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:18:49.173183 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kmxxq_cd7df76b-5df8-4dcb-8e6c-8e98f5533e3b/routeoverride-cni/0.log" Apr 21 04:18:49.192966 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:18:49.192941 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kmxxq_cd7df76b-5df8-4dcb-8e6c-8e98f5533e3b/whereabouts-cni-bincopy/0.log" Apr 21 04:18:49.211997 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:18:49.211972 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kmxxq_cd7df76b-5df8-4dcb-8e6c-8e98f5533e3b/whereabouts-cni/0.log" Apr 21 04:18:49.562316 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:18:49.562258 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-j6rnj_7b9f5e8e-4f2f-4d07-87f7-ef56b124e3a6/kube-multus/0.log" Apr 21 04:18:49.633124 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:18:49.633088 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-2lrq9_6e98b17f-4794-44aa-8756-58a9bd9cb37a/network-metrics-daemon/0.log" Apr 21 04:18:49.651800 ip-10-0-134-136 kubenswrapper[2578]: 
I0421 04:18:49.651769 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-2lrq9_6e98b17f-4794-44aa-8756-58a9bd9cb37a/kube-rbac-proxy/0.log" Apr 21 04:18:50.811084 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:18:50.811057 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-82wml_37b63820-2bf0-4a5b-82c5-6b56ab7689b7/ovn-controller/0.log" Apr 21 04:18:50.827374 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:18:50.827324 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-82wml_37b63820-2bf0-4a5b-82c5-6b56ab7689b7/ovn-acl-logging/0.log" Apr 21 04:18:50.839305 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:18:50.839218 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-82wml_37b63820-2bf0-4a5b-82c5-6b56ab7689b7/ovn-acl-logging/1.log" Apr 21 04:18:50.859429 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:18:50.859396 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-82wml_37b63820-2bf0-4a5b-82c5-6b56ab7689b7/kube-rbac-proxy-node/0.log" Apr 21 04:18:50.881508 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:18:50.881481 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-82wml_37b63820-2bf0-4a5b-82c5-6b56ab7689b7/kube-rbac-proxy-ovn-metrics/0.log" Apr 21 04:18:50.900300 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:18:50.900259 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-82wml_37b63820-2bf0-4a5b-82c5-6b56ab7689b7/northd/0.log" Apr 21 04:18:50.920119 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:18:50.920089 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-82wml_37b63820-2bf0-4a5b-82c5-6b56ab7689b7/nbdb/0.log" Apr 21 04:18:50.939822 ip-10-0-134-136 kubenswrapper[2578]: 
I0421 04:18:50.939797 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-82wml_37b63820-2bf0-4a5b-82c5-6b56ab7689b7/sbdb/0.log" Apr 21 04:18:51.115419 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:18:51.115338 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-82wml_37b63820-2bf0-4a5b-82c5-6b56ab7689b7/ovnkube-controller/0.log" Apr 21 04:18:52.505488 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:18:52.505460 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-n6d2m_b5d6aff7-e2e1-4646-9d82-8c931e49196e/network-check-target-container/0.log" Apr 21 04:18:53.536074 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:18:53.536042 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-7j4lb_b4a64573-596e-4e34-a0d2-ec31f17a6ba5/iptables-alerter/0.log" Apr 21 04:18:54.293854 ip-10-0-134-136 kubenswrapper[2578]: I0421 04:18:54.293819 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-x9r2c_6de91b0f-af7b-43a5-8a43-cee7c5ba996e/tuned/0.log"